Sunday, April 22, 2012

Does Not Compute

London Metropolitan University seems to be getting rather desperate in its attempts to attract students. On the one hand:


"A London university is considering establishing alcohol-free zones on its campuses because so many of its students consider drinking to be immoral.
Professor Malcolm Gillies, vice-chancellor of London Metropolitan University, said the selling of alcohol was an issue of "cultural sensitivity" at his institution where a fifth of students are Muslim.
Speaking to a conference of university administrators in Manchester, he said that for many students, drinking alcohol was "an immoral experience".
"Because there is no majority ethnic group [at London Metropolitan], I think [selling alcohol] is playing to particular parts of our society much more [than to others]," he was reported as saying in the Times Higher Education magazine."
On the other:

"As part of a master's course in events experience management, London Metropolitan University will offer a module in partnership with Chillisauce, known for organising custom stag dos across Europe.
The firm's website lists options including mud-wrestling with scantily clad women in Budapest, a "spa with strippers" in Riga, or the option to be "punished" at a Tallinn "lap dancing dreamland".
The link has drawn criticism from unions, although students will be involved only in the company's more straightforward commercial activities - including corporate dinners and conferences.
Participants on the course will be asked to create a Guinness World Record attempt that doubles as a PR event for a consumer brand. They will devise a "creative concept" and pitch it to Chillisauce executives, who will attend seminars and lectures during the module.
But a University and College Union spokesman questioned "how employing a company that specialises in stag weekends offering wrestling with scantily clad women in jelly is likely to do much for a university's reputation"."
Opening the Black Box

I have just seen an article by Mu-Hsuan Huang of National Taiwan University published in Research Evaluation, an Oxford University Press journal. Here is the abstract:

"In the era of globalization, the trend of university rankings gradually shifts from country-wide analyses to world-wide analyses. Relatively high analytical weightings on reputational surveys have led Quacquarelli Symonds (QS) World University Rankings to criticisms over the years. This study provides a comprehensive discussion of the indicators and weightings adopted in the QS survey. The article discusses several debates stirred in the academia on QS. Debates on this ranking system are presented in the study. Firstly, problems of return rate, as well as unequal distribution of returned questionnaires, have incurred regional bias. Secondly, some universities are listed in both domestic and international reputation questionnaires, but some others are listed only in the domestic part. Some universities were evaluated only by domestic respondents, limiting their performance of the ranking results. Thirdly, quite a few universities exhibit the same indicator scores or even full scores, rendering the assessment questionable. Lastly, enormous changes of single indicator scores suggest that the statistic data adopted by QS Rankings should be further questioned."

The article is useful and interesting, especially the table of rates of return for different countries. But it does not seem to go beyond what this blog and others have been saying for a long time.

This seems to be another case of mainstream academia lagging behind the mass media, which in turn is way behind the blogosphere.

Wednesday, April 18, 2012

The Euro-Rankings

The U-Multirank project has slowly crept past the feasibility study stage. While any attempt to undermine the triopoly of the big three global rankers, ARWU (Shanghai), QS and Times Higher Education, is welcome, it does seem to be taking a long time. One wonders whether it will get going before the Eurozone disintegrates. Still, the project does have its defenders.


"Jordi Curell, director of lifelong learning, higher education and international affairs at the directorate general for education and culture, conceded that there was opposition to its development.

"When we started working on the project of U-Multirank, many people from the higher education community were opposed to it,” he told an international symposium on university rankings and quality assurance in Brussels on 12 April.

But the system had intrinsic value, he said, because it would provide an evidence-based measure of the performance of European universities that would help them improve.

According to Curell, if higher education is to help Europe emerge from its current financial and economic crisis, the EU needs to know how its universities are performing and universities need to know how they are doing.
"Rankings which are carefully thought out are the only transparency tools which can give a comparative picture of higher education institutions at a national, European and global level," he told the symposium."



There are critics, of course. One of them is a committee of the British House of Lords, which has argued that U-Multirank is a waste of money. Four million euros sounds like rather a lot, but considering what the EU has spent money on, it is trivial.

And as for wasting taxpayers' money, well, the committee could look at the other house and think about a floating duck island, pantyliners, nappies, soundproofed bedrooms and so on and so on.

Tuesday, April 17, 2012

The Ranking Business

In INSIDE HIGHER ED, Kris Olds comments on the growing commercialisation of university rankings.

Saturday, April 14, 2012

US Faculty Salaries

The Chronicle of Higher Education features a report from the AAUP on faculty salaries, along with data on student-faculty ratios. Predictably, Harvard, where the average full professor gets $198,000 per annum, is at the top.

If anyone ever has the time, it ought to be possible to work out the productivity of faculty with regard to publications, citations, and students taught and graduated.

Thursday, April 12, 2012

News from Australia

There is now a website, MyUniversity, that allows anyone to compare Australian public universities on attributes such as staff qualifications, student-to-staff ratio, graduate employment and so on.

Of course, the value of the site will be little better than the quality of the information uploaded. Perhaps someone could take a look at the site and see how accurate the data is.

According to University World News, the Australian academic staff union is not happy.

'Rea said that if students were looking to base their choice of institution on whether a campus had an automatic teller machine, the site might be useful. But if they wanted an indication of the quality of teaching and research at any given institution, the information provided relied on a set of indicators that had been under question for many years.

The union had been critical for some time of the misuse of statistical data, such as graduate employment outcomes and student satisfaction results, in determining the quality of learning and teaching. Yet these were included as measurable indicators of quality by the website.

“The use of student satisfaction scores in particular is prone to manipulation and does not reflect quality in teaching. Indeed, if institutions based their courses on whether students liked their subjects, which is essentially what these metrics capture, they would risk driving down the quality of degrees from Australian universities.

“There is always a danger of teaching to the test – or the survey, in this case,” Rea said.

She said the diversity of Australian universities made it difficult to attempt any comparisons. Although the union believed students should be able to make an informed choice of where best to study, it should be just that – an informed choice based on accurate, clear and transparent information.

“This can only happen if the indicators or measures used to create this information are specific, widely understood and agreed, and incapable of institutional manipulation.” '

Tuesday, April 10, 2012

Ranking News: QS World University Rankings

Like many others, I have just received an invitation to take part in the QS academic survey, which contributes 40 per cent to their World University Rankings. If I understand it correctly, QS now gets the majority of its respondents from the Mardev mailing lists. That means that subscribing to an academic journal is sufficient qualification to rank the research capabilities of universities around the world.

The Times Higher Education World University Rankings also have a reputation survey. It is a little more difficult to join this club: to be a respondent it is necessary to have written (or co-written, or perhaps "co-written"; these days the concept of authorship is getting quite blurred) an article in an ISI-indexed journal. On balance, the THE survey is likely to be more reliable, although I think QS could argue that those who read academic journals but do little research are a constituency whose views should be considered.
Ranking News: US News

Robert Morse has announced that US News has begun to collect data for the 2013 US college rankings.

"We recently started collecting the statistical data that will be used for the 2013 edition of our college rankings, which will be published later this year. Data collection for the three U.S. News statistical surveys—the main one, financial aid, and finance—began on March 27.

These surveys gather information on such factors as enrollment, faculty, tuition, room and board, SAT and ACT scores, admissions criteria, graduation and retention rates, college majors, faculty, school finances, activities, sports, and financial aid. This data is used in the Best Colleges rankings that will be published on usnews.com and in the print guidebook that will be available on newsstands."

Monday, March 26, 2012

The University Challenge Rankings

The quiz show University Challenge provides a plausible supplement to the established British league tables. If we award two points for winning and one for being runner-up (I confess this is from Wikipedia), then we get this ranking:

1.    Oxford                                    39
2.    Cambridge                                 21
3.    Manchester                                 8
4=.   Imperial College London                    5
4=.   Open University                            5
6=.   Durham                                     4
6=.   Sussex                                     4
8=.   St. Andrews                                3
8=.   Birkbeck College, University of London     3
10=.  Bradford                                   2
10=.  Dundee                                     2
10=.  Keele                                      2
10=.  Leicester                                  2
10=.  Belfast                                    2
10=.  Warwick                                    2
10=.  Lancaster                                  2
17=.  LSE                                        1
17=.  Cranfield                                  1
17=.  Sheffield                                  1
17=.  York                                       1

Bristol, Edinburgh and University College are not there at all and LSE does not perform very well.
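As a sketch of the arithmetic, the scoring and tie-handling can be written in a few lines of Python. Only the rule itself (two points for a win, one for a runner-up finish) comes from the post; the win and runner-up counts below are invented for illustration:

```python
# Two points per series win, one per runner-up finish, with standard
# competition ranking (tied teams share a rank, marked with "=").
# The (wins, runner-up) counts here are illustrative, not a real tally.
results = {
    "Oxford": (17, 5),
    "Cambridge": (9, 3),
    "Manchester": (3, 2),
}

points = {uni: 2 * wins + runners for uni, (wins, runners) in results.items()}
table = sorted(points.items(), key=lambda kv: -kv[1])

rank = 0
for i, (uni, pts) in enumerate(table):
    # A team's rank is 1 + the number of teams strictly ahead of it.
    if i == 0 or pts < table[i - 1][1]:
        rank = i + 1
    tied = sum(1 for _, p in table if p == pts) > 1
    print(f"{rank}{'=' if tied else ''}.  {uni}  {pts}")
```

With real win and runner-up counts plugged in, the same loop reproduces the "4=", "6=" style of shared ranks used in the table above.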

The show inspired an Indian version and in one memorable "cup winners' cup" final Sardar Patel College of Engineering beat Gonville and Caius, Cambridge.

Apologies for the weird spacing. I am looking up how to do tables in blogspot.

Tuesday, March 20, 2012

Comments on the THE Reputation Rankings

From Kris Olds at GlobalHigherEd

The 2012 Times Higher Education (THE) World Reputation Rankings were released at 00.01 on 15 March by Times Higher Education via its website. It was intensely promoted via Twitter by the ‘Energizer Bunny’ of rankings, Phil Baty, and will be circulated in hard copy format to the magazine’s subscribers. As someone who thinks there are more cons than pros related to the rankings phenomenon, I could not resist examining the outcome, of course! Harvard demolishes the others at the top of the reputation standings.

I do have to give Phil Baty and his colleagues at Times Higher Education and Thomson Reuters credit for enhancing the reputation rankings methodology. Each year their methodology gets better and better.

From Alex Usher at Higher Education Strategy Associates

There actually is a respectable argument to be made for polling academics about “best” universities. Gero Federkeil of the Centrum für Hochschulentwicklung in Gütersloh noted a few years ago that if you ask professors which institution in their country is “the best” in their field of study, you get a .8 correlation with scholarly publication output. Why bother with tedious mucking around with bibliometrics when a survey can get you the same thing?

Two reasons, actually. One is that there’s no evidence this effect carries over to the international arena (could you name the best Chinese university in your discipline?) and second is that there’s no evidence it carries over beyond an academic’s field of study (could you name the best Canadian university for mechanical engineering?).

So, while the Times makes a big deal about having a globally-balanced sample frame of academics (and of having translated the instrument into nine languages), the fact that it doesn’t bother to tell us who actually answered the questionnaire is a problem. Does the fact that McGill and UBC do better on this survey than on more quantitatively-oriented research measures have to do with abnormally high participation rates among Canadian academics? Does the fact that Waterloo fell out of the top 100 have to do with the fact that fewer computer scientists, engineers and mathematicians responded this year? In neither case can we know for sure.

Sunday, March 18, 2012

The THE Reputation Rankings

Times Higher Education has produced its third reputation ranking based on a survey of researchers published in ISI indexed journals. The top ten are:

1.  Harvard
2.  MIT
3.  Cambridge
4.  Stanford
5.  UC Berkeley
6.  Oxford
7.  Princeton
8.  Tokyo
9.  UCLA
10. Yale

This does not look all that dissimilar to the academic survey indicator in the 2011 QS World University Rankings. The top ten there is as follows:

1.  Harvard
2.  Cambridge
3.  Oxford
4.  UC Berkeley
5.  Stanford
6.  MIT
7.  Tokyo
8.  UCLA
9. Princeton
10. Yale

Once we step outside the top ten there are some differences. The National University of Singapore is 23rd in these rankings but 11th in the QS academic survey, possibly because QS still has respondents on its list from the time when it used World Scientific, a Singapore-based publishing company.

The Middle East Technical University in Ankara is in the top 100 (in the QS academic survey it is not even in the top 300), sharing the 90-100 band with Ecole Polytechnique, Bristol and Rutgers. At first glance this seems surprising, since its research output is exceeded by other universities in the Middle East. But the technical competence of its University Ranking by Academic Performance, produced by the university's Informatics Institute, suggests that its research might be of a high quality.

Sunday, March 11, 2012

Law School Rankings

Anyone interested in the current arguments about American law school rankings might visit the blog of Lawyers against the Law School Scam.
University Ranking in Pakistan

University World News has an article on the ranking of Pakistani universities by the country's Higher Education Commission (HEC). According to Ameen Amjad Khan:

According to the Higher Education Commission (HEC) ranking, Islamabad’s Quaid-e-Azam University tops 136 public and private sector institutions, followed by the Pakistan Institute of Engineering and Applied Sciences, with Karachi’s Agha Khan University in third place.

Academics from the University of Karachi and the University of Peshawar have rejected the ranking, which does not place either institution in the top 10.

They have accused the HEC of tampering with the standard formula to favour some institutions and have demanded that their vice-chancellors formally convey their disapproval to HEC bosses.

Faculty members of Hyderabad’s Liaquat University of Medical and Health Sciences have even warned that they will take the matter to the court if the ranking is not revoked. They said in a statement on 29 February: “The HEC announced the rankings in haste and caused chaos in both public and private higher education institutions.”

It seems that the rankings were based on a modified version of the QS World University Rankings. Among the modifications was the introduction of indicators based on the number of journals published and the number of grants from the HEC.
International Patent Filing

The World Intellectual Property Organization has issued a report on the filing of patents. The top university is the University of California, followed by MIT, the University of Texas, Johns Hopkins and the Korea Advanced Institute of Science and Technology.

Tuesday, March 06, 2012

O, My Felony is Rank

The Social Science Research Network has just published a paper by Morgan Cloud and George Shepherd, 'Law Deans in Jail'. Notice the absence of a question mark. Here is the abstract:

A most unlikely collection of suspects - law schools, their deans, U.S. News & World Report and its employees - may have committed felonies by publishing false information as part of U.S. News' ranking of law schools. The possible federal felonies include mail and wire fraud, conspiracy, racketeering, and making false statements. Employees of law schools and U.S. News who committed these crimes can be punished as individuals, and under federal law the schools and U.S. News would likely be criminally liable for their agents' crimes. Some law schools and their deans submitted false information about the schools' expenditures and their students' undergraduate grades and LSAT scores. Others submitted information that may have been literally true but was misleading. Examples include misleading statistics about recent graduates' employment rates and students' undergraduate grades and LSAT scores. U.S. News itself may have committed mail and wire fraud. It has republished, and sold for profit, data submitted by law schools without verifying the data's accuracy, despite being aware that at least some schools were submitting false and misleading data. U.S. News refused to correct incorrect data and rankings errors and continued to sell that information even after individual schools confessed that they had submitted false information. In addition, U.S. News marketed its surveys and rankings as valid although they were riddled with fundamental methodological errors.

It is unlikely that we will ever see law school deans in jail. It seems that it would be necessary to show a connection between the submission or publication (or failure to retract publication) of false or misleading information and monetary gain, and that might be rather difficult to prove.

Also, the authors of the paper are from Emory University, a middling institution. If it ever did come to criminal proceedings you can bet that the big law schools would be lined up behind the rankings.

Monday, March 05, 2012

Best Student Cities

QS has just announced its Best Student Cities ranking. The criteria are performance in the QS World University Rankings, student mix, quality of living, employer activity and affordability.

To be included, a city must have a population of at least 250,000 and at least two institutions in the QS rankings. That explains why Oxford and Cambridge are not on the list but Cairo and Santiago are.
The top five are:

1.  Paris
2.  London
3.  Boston
4.  Melbourne
5.  Vienna

Saturday, February 18, 2012

Light Posting Ahead

For the next few weeks posting will be light as I am attending to family matters.

Wednesday, February 08, 2012

Eight years of ranking: What have we learned?

My post at the University World News World Blog can be viewed here.

Wednesday, February 01, 2012

University Ranking by Academic Performance

Perhaps too much attention is given to the big three international rankings -- ARWU (the Shanghai Rankings), the QS World University Rankings and the Times Higher Education World University Rankings.

So, here is what I hope will be the first of a number of posts about the other rankings that have appeared recently.

The University Ranking by Academic Performance is produced by the Informatics Institute of the Middle East Technical University in Ankara.

The indicators are as follows:

Number of articles -- measures current scientific productivity
Google Scholar results -- measures long-term overall productivity
Citations -- measures research impact
Cumulative journal impact -- measures scientific impact
H-index -- measures research quality
International collaboration -- measures international acceptance


These are the top ten:

1.  Harvard
2.  Toronto
3.  Johns Hopkins
4.  Stanford
5.  UC Berkeley
6.  Tokyo
7.  Michigan, Ann Arbor
8.  Washington, Seattle
9.  UCLA
10. Oxford
Think Tank Ranking

There seems to be little that cannot be ranked these days. In the UK there are primary school league tables although nowhere in the world is there a kindergarten ranking. Yet.

The University of Pennsylvania has produced a report on think tanks. These are apparently flourishing everywhere: even Andorra has one.

Here are the top five world wide:

1.  Brookings Institution, USA

2.  Chatham House Royal Institute of International Affairs, UK

3.  Carnegie Endowment for International Peace, USA

4.  Council on Foreign Relations, USA

5.  Center for Strategic and International Studies, USA

Thursday, January 26, 2012

Guest Post

Today's post is by Masturah Alatas. It is in response to a comment in the Kuala Lumpur New Straits Times by James Campbell that begins:

"Any discussion of Malaysian tertiary educational policy needs to take into account the needs of national development in a specific and historical context. Recent debates in regard to the competitive position of Malaysian higher education globally is one area where the pressures of competition and liberalisation must be balanced by the interests of inclusion and social sustainability."


Over the last few years there has been an ongoing debate between those Malaysian academics who accept the challenge of globalisation and those who are more concerned with, as Campbell puts it, "ensuring national values, addressing horizontal social inequality, rural disadvantage and looking into the needs of sustainable and inclusive economic and social development".

The comment continued by invoking the name of Syed Hussein Alatas, a Malaysian social and political theorist who had considerable international influence, being cited by, among others, Edward Said.



"The discourse of neo-liberal globalisation is itself still arguably beholden to what Syed Hussein Alatas critiqued as the discourse of “The Lazy Native”. Higher educational institutions’ commitment to inclusion and social justice is central to their merit in society."


The following is a response to this comment by the daughter and biographer of Alatas.

Finding the clarity

by Masturah Alatas

As biographer of late Malaysian sociologist Syed Hussein Alatas (1928 – 2007), I do not consider his life story to end with the end of his life. Any continuing narrative about Alatas also has to take into account how, for example, he is talked about in the media today.

As a case in point, I refer to the article Finding the Balance (Learning Curve, New Straits Times, 20 November 2011) by Dr James Campbell, a lecturer in Education at Deakin University, Australia, and researcher with Universiti Sains Malaysia.

Even though the article carries a moribund photo of Alatas, it contains only one sentence in direct reference to him. And the sentence is difficult to understand. “The discourse of neo-liberal globalization is itself still arguably beholden to what Syed Hussein Alatas critiques as the discourse of The Lazy Native”.

The article does not explain what, if at all, neo-liberal globalization has to do with “the discourse of The Lazy Native”. And nowhere in the article is it clearly stated that many of Malaysia’s current higher education policies are neo-liberal to begin with. So Campbell is vague. He seems to be criticizing neo-liberalism in general, but not what is specifically neo-liberal about Malaysia’s higher education policies.

Campbell argues that comparisons between the National University of Singapore and Universiti Malaya may not always be valid “given important distinctions and differences in national policies and political culture between the two nations.”

But how does this reasoning square with the fact that a Malaysian sociologist like Syed Hussein Alatas taught at the National University of Singapore for over twenty years, and was appointed Head of Department of Malay studies there? What criteria of merit did NUS apply when they tenured Syed Hussein Alatas? Is Campbell suggesting that the same criteria cannot be applied in a Malaysian university? And if so, why not?

Malaysians may still remember Syed Hussein Alatas' short-lived, controversial term (1988 – 1991) as Vice-Chancellor of Universiti Malaya when he tried to promote academic staff based on merit. For Alatas, one way to establish merit was to look at the publications of academics. The New Straits Times itself carried reports of the controversy, so it is no secret. Some of them contained statements like “five members of the students association have come out in support of Vice-Chancellor Syed Hussein Alatas’s stand on the appointment of non-Malay deans to faculties in the university” (NST, 12 March 1990).

When Campbell writes that any discussion of Malaysia’s higher education policies “needs to be placed in perspective against the needs of national development in a specific historical context”, and that notions of merit must take into account ideas of inclusion and social justice, what exactly does he mean? Inclusion, of course, necessarily entails exclusion. But who is being excluded and on what grounds are they being excluded? And whose sense of social justice are we talking about here?

Syed Hussein Alatas’ works from The Myth of the Lazy Native to his writings on corruption precisely warn against the dangers of relativising notions of social justice. So it is quite odd that Campbell would refer to Alatas in his article.

All written legacies can be appropriated, rightly and wrongly, to support a particular persuasion or agenda, and it depends on critics to call attention to what is right or wrong appropriation. One of the best writers I know, for example, who has creatively applied Syed Hussein Alatas’ ideas on mental captivity and the inability to raise original problems to the role of education and the Arts, is former New Straits Times columnist U-En Ng. “You can use entertainment to pull wool over people’s eyes and divert attention away from whatever it is you don’t want them to see or think about. Or, more positively, you can use artistic expression to build civic participation and the capacity to raise original problems,” he writes in the article Governing through the Arts (NST, 09 January 2011). “Rebellious performance art by university students tells you both what you can expect from the current education system as well as how the public might react to new ideas.”

At the same time, a young Italian poet from Osimo by the name of Andrea Palazzo reminds us that an excess of “the so-called need to express oneself can be the mortal enemy of Beauty and Truth. Overshadowed by popular opinion, great art dies, or rather it disappears, or languishes in museums.” And for Palazzo, the prospects for Philosophy in some Italian universities, which he feels are “conservative rather than selective”, are not much brighter either.

Syed Hussein Alatas believed that any process of change must necessarily be accompanied by a philosophical set of criteria for selection of what to reject, retain and strive for (see Erring Modernization, 1975). Quoting from German philosopher Friedrich Nietzsche’s The Use and Abuse of History, Alatas stressed the importance of having a horizon of thinking, “a line that divides the visible and the clear from the vague and shadowy” in order for an individual, a community and a nation to thrive. “We must know the right time to forget as well as the right time to remember, and instinctively see when it is necessary to feel historically and when unhistorically.”

It is, of course, extremely difficult to know when and what to remember and forget, especially when the spectre of Malaysia’s 1969 May 13 Racial Riots still haunts us, and when Malaysia is making itself dizzy by rushing to become a High Income Nation before eight years are up.


But one way for a line, a horizon of thinking, to become clearer is through good writing. And this may just be what will rescue Alatas’ work from languishing in the shadows.



Masturah Alatas


Sunday, January 22, 2012

Worth Reading

Ranking in Higher Education: Its Place and Impact by Jan Sadlak. Originally appeared in the Europa World of Learning 2010.

Wednesday, January 18, 2012

The Power of Small Numbers

My attention has just been drawn to the citations-per-paper indicator in the 2011 QS Asian University Rankings. In first place is the University of Santo Tomas in the Philippines, a very good school in some ways but not usually considered a research rival to Hong Kong or Tokyo. It seems that UST's success was the result of just one much-cited medical paper of which just one UST researcher was a co-author.

Another highly cited, many-authored medical paper seems to explain Universitas Padjadjaran's appearance in sixth place on this indicator, even though its total number of papers is extremely small.
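A hedged numerical sketch of the small-numbers effect at work here: all the figures below are invented, but they show how a single heavily cited, many-authored paper can lift a low-output university's citations-per-paper score far above that of a major research university.

```python
# Hypothetical illustration of the small-numbers problem in a
# citations-per-paper indicator. None of these figures are real data.

def citations_per_paper(citations, papers):
    return citations / papers

# A big research university: 20,000 papers, 150,000 citations.
big = citations_per_paper(150_000, 20_000)          # 7.5

# A small university: 60 papers, 500 citations of its own...
small_before = citations_per_paper(500, 60)         # about 8.3

# ...plus co-authorship on a single paper cited 2,000 times.
small_after = citations_per_paper(500 + 2_000, 61)  # about 41.0

print(big, round(small_before, 1), round(small_after, 1))
```

One paper, in other words, is enough to quintuple the small university's score while leaving the big university's essentially unchanged.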

Leiden University has started offering fractional counting of publications in its rankings:

The fractional counting method gives less weight to collaborative publications than to non-collaborative ones. For instance, if the address list of a publication contains five addresses and two of these addresses belong to a particular university, then the publication has a weight of 0.4 in the calculation of the bibliometric indicators for this university. The fractional counting method leads to a more proper normalization of indicators and to fairer comparisons between universities active in different scientific fields. Fractional counting is therefore regarded as the preferred counting method in the Leiden Ranking.

This would be one way of avoiding giving a high position to universities that produce little but manage to get researchers included as co-authors of a few papers, usually in medical journals.
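The Leiden rule quoted above is simple enough to state in code. A minimal sketch, with hypothetical university names:

```python
# Fractional counting as described in the Leiden quote: a paper's weight
# for a university is the share of its author addresses belonging to
# that university, rather than a full count of 1.

def fractional_weight(addresses, university):
    """Share of a publication's address list belonging to `university`."""
    return sum(1 for a in addresses if a == university) / len(addresses)

# The example from the quote: two of five addresses belong to the
# university, so the paper counts as 0.4 of a publication for it.
paper = ["Univ A", "Univ A", "Univ B", "Univ C", "Univ D"]
print(fractional_weight(paper, "Univ A"))  # 0.4

# Under full counting the same many-authored paper would count as 1.0
# for every listed university, which is how one highly cited medical
# paper can dominate a small university's indicator.
```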
FindTheBest

This is a site that has just come to my attention. There is a great variety of rankings of things like antivirus software, snowboards and fertility clinics, and also of colleges and universities, including business, law and medical schools.

The colleges and universities ranking is US-only and includes a "smart ranking" combining statistical information with the Forbes, US News and ARWU (Shanghai) rankings. This sounds like a good idea but there does not seem to be any information about the methodology.

Tuesday, January 17, 2012

The Happiest University in Britain?

According to the Daily Telegraph it's St Mary's University College Belfast. I thought Belfast was in Ireland but then again I did not do geography in secondary school.