Monday, May 07, 2012

Oxford and Cambridge

Following on from the last post, can anyone explain why Cambridge can't produce Prime Ministers and Oxford can't produce comedians, at least not professional ones?
And now for something a little bit different

Times Higher Education has announced that it will publish a ranking of new universities (less than fifty years old).

The Times Higher Education 100 Under 50 will – as its name suggests – rank the world’s top 100 universities under the age of 50. The table and analysis will be published online and as a special supplement to the magazine on 31 May, 2012.
The vast majority of the world’s top research-led universities have at least one thing in common: they are old. Building upon centuries of scholarly tradition, institutions such as the University of Oxford, which can trace its origins back to 1096, can draw on endowment income generated over many years and have been able to cultivate rich networks of loyal and successful alumni (including in Oxford’s case a string of British Prime Ministers) to help build enduring brands.
Deja Vu All Over Again

Malaysia's love-hate affair with international rankings has taken another twist. The official target now is to get one university into the top 50 and three into the top 100 of the QS rankings. That basically means that a Malaysian university will have to be the equal of New South Wales, Tsinghua or Warwick.

Last year Universiti Malaya got into the top 500 of the Shanghai ARWU ranking. That is a solid achievement, and it might mean more if Malaysia could get another university there.


This is part of its efforts to have a local university ranked among the world's top 50 universities by 2020.
Deputy Higher Education Minister Datuk Saifuddin Abdullah said the National Higher Education Strategic Plan also called for at least three local universities to be ranked among the world's top 100 universities.
To achieve this, he told the House that the ministry needed to continuously recruit international students and participate in international education fairs to promote the "Education Malaysia" brand.
He was replying to Senator Mohd Khalid Ahmad who wanted to know why no local universities had been ranked among the world's top 200.
Saifuddin said the ministry was also intensifying promotional activities on the Internet and introducing student mobility programmes. These will allow students to take short-term courses for credit and will support better staff and student exchange programmes with foreign universities.
He said they were also having better scholarship coordination with foreign agencies and other bodies to facilitate the intake of foreign students at local universities.
He said the QS World University Ranking (QS WUR) was the preferred benchmark used to gauge a university.

What have I done?

I was recently in a public library somewhere in Southeast Asia. While browsing around I discovered that access to this blog was blocked because of "other adult material".

I thought that perhaps someone was upset with university rankings in general, which is entirely understandable, but the THE, QS, ARWU, Webometrics and HEEACT sites were all unblocked.

My best guess is that the filter software interprets anything with "watch" in it as something to do with voyeurism. Or perhaps the post about "does size really matter?" was misunderstood.

Or perhaps it just means that this blog is very mature and sophisticated.

Sunday, May 06, 2012

Is this what they meant by diversity?

On December 1st of last year I commented on proposals that the US News Law School Rankings should include an indicator for diversity.

Such proposals are based on the increasing globalisation of the world economy and the need to understand other cultures. It is obvious that middle-class white Americans who support abortion, gay marriage, affirmative action, feminism and Obama must sit in classes with middle-class African Americans who support abortion, gay marriage, affirmative action, feminism and Obama if they wish to communicate effectively with North Korean bureaucrats, supporters of Boko Haram who will soon control most of Nigeria, the Muslim Brotherhood or the Haredim who will soon control Israel.

The story of Elizabeth Warren, a professor at Harvard Law School who claims 1/32 Native American ancestry (which is not necessarily the same as Native American identity)  shows the importance of diversity in law education. Were it not for her Cherokee great-great-great grandmother she would obviously be teaching something quite different to her students and so would render them unfit to compete in our diverse multicultural world.

Tuesday, May 01, 2012

Student Experience Survey

From Times Higher Education, the top ten British universities from the students' viewpoint.

1.   Dundee
2.   Loughborough
3.   Sheffield
4.   Oxford
5.   Cambridge
6.   East Anglia
7.   Southampton
8.   Aberystwyth
9.   Glasgow
10.  Leeds

Monday, April 30, 2012

Thomson Reuters in Trouble?

The online financial newsletter StreetAuthority has published a list of 12 companies that are at risk because of a looming debt problem.

Among them is Thomson Reuters, which, among other things, runs the Web of Science database and provides the data for the Times Higher Education World University Rankings. Its data is also used in the Shanghai Jiao Tong University rankings to construct the Highly Cited Researchers and Publications indicators.

Thomson Reuters will not, of course, go bankrupt tomorrow. But if the company is in trouble, there could be implications for international university rankings.



During the past generation, a reasonable level of debt has always been seen as appropriate, because balance sheets were able to withstand a typical recession. Yet all that changed in 2008. GM's (NYSE: GM) debt load crashed the company, forcing it into bankruptcy, while many other companies such as GE (NYSE: GE), Ford Motor (NYSE: F), Hertz (NYSE: HTZ) and Domino's Pizza (NYSE: DPZ) saw their stocks plunge on fears a bankruptcy filing would be necessary if economic conditions worsened.

Thankfully, many companies wised up and have been taking steps to strengthen their balance sheets. But not everyone got the message. Some companies still carry too much debt and might run into trouble if the U.S. economy slips back into recession. These companies will need to make large payments to handle their debt, and right now they are at risk of not having enough cash to meet potential obligations. Typically, a company can simply roll over that debt and push out the time frame when debts come due. But a weak economy would make this task much harder as lenders grow skittish.

That's why it's so important to pay attention to balance sheets. Lots of debt is only a problem if the debts are soon coming due. For example, mattress maker Sealy Corp. (NYSE: ZZ) has a very weak balance sheet, with almost $800 million in debt and less than $100 million in cash. But management wisely rolled over its debt while it could, and now the company faces no major repayments until 2014.

But if a company's "current portion of long-term debt" -- that is, debts due within the next 12 months -- exceeds cash on hand, you need to listen to how management plans to address the problem because these companies could be at risk of failing. I went in search of companies that may have just such a problem (less cash than near-term loan obligations). I also added Canadian media firm Thomson Reuters (NYSE: TRI) to the mix because its weak balance sheet is just above that threshold. The table below highlights a group of companies that are at risk of having to declare bankruptcy in 2012 if their lenders are in no mood to extend them more loans.

Wednesday, April 25, 2012

LSE overtakes Oxford in British Ranking

The Complete University Guide has just published its rankings of British universities. According to Brendan O'Malley in University World News:




It is the first time since 2000 that Oxford and Cambridge have not shared the top two spots – in that year Imperial College London knocked Oxford into third place.


In separate listings for the leading universities and higher education institutions covering 62 subjects, Cambridge is in the top 10 for all 46 subjects it offers, and top in 30. Oxford is in the top 10 for all 32 of its subjects, and is placed first in 12. The LSE is in the top 10 for all 12 subjects offered, and top for three.


There are two new entrants to the Top 20 – the University of Glasgow (17th) and Leicester (19th). They have replaced Sussex, which just missed out in 21st place, and the School of Oriental and African Studies, which fell from 15th to 30th position.



O'Malley continues:


The interactive guide ranks universities on nine factors: student satisfaction; research assessment; entry standards; student-to-staff ratio; spending on academic services; spending on student facilities; good honours degrees achieved; graduate prospects; and completion.

The subject tables are based on four factors: student satisfaction; research assessment; entry standards; and graduate prospects.


When LSE, which offers only 12 subjects, can still beat Oxford, which offers 32, these results need to be approached with caution. Bernard Kingston is honest enough to issue a health warning.

The good honours degrees and completion rate indicators are obvious incentives to game the system.
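
For readers wondering how such a table is actually put together, the usual recipe is a weighted sum of normalised indicator scores, which is also why a heavily weighted, easily influenced indicator is worth gaming. Here is a minimal sketch in Python of that general method; the weights and data are entirely made up for illustration and are not the Complete University Guide's actual figures.

# A toy league table: convert each indicator to a z-score across
# institutions, multiply by a weight, and add up.
# All weights and raw scores below are invented for illustration.

import statistics

WEIGHTS = {          # hypothetical weights, not the Complete University Guide's
    "student_satisfaction": 1.5,
    "research": 1.5,
    "entry_standards": 1.0,
    "good_honours": 1.0,
    "completion": 1.0,
}

DATA = {             # invented raw scores for three imaginary universities
    "Univ A": {"student_satisfaction": 4.1, "research": 2.9,
               "entry_standards": 510, "good_honours": 78, "completion": 94},
    "Univ B": {"student_satisfaction": 3.9, "research": 3.1,
               "entry_standards": 540, "good_honours": 82, "completion": 96},
    "Univ C": {"student_satisfaction": 4.3, "research": 2.5,
               "entry_standards": 470, "good_honours": 74, "completion": 90},
}

def z_scores(values):
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    return [(v - mean) / sd if sd else 0.0 for v in values]

names = list(DATA)
totals = dict.fromkeys(names, 0.0)
for indicator, weight in WEIGHTS.items():
    for name, z in zip(names, z_scores([DATA[n][indicator] for n in names])):
        totals[name] += weight * z

for rank, (name, score) in enumerate(sorted(totals.items(),
                                            key=lambda kv: -kv[1]), start=1):
    print(rank, name, round(score, 2))

The point of the toy example is simply that whichever indicators carry weight and are under an institution's own control, such as good degrees awarded and completion, are exactly where the gaming will happen.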
London Met Update

It seems that the proposal to ban alcohol from one campus was not quite what was reported. Apparently the vice-chancellor was looking for an excuse to save money. The latest report is that Muslim students are disassociating themselves from his plan.

Tuesday, April 24, 2012

Another Quality Initiative

Every so often education bureaucrats come up with bright ideas to boost quality and raise standards and proclaim their faith that students and teachers at all levels can perform as well as anybody anywhere.

A recent example is the decree by the Director General of Higher Education at the Indonesian Education and Culture Ministry that, in order to graduate, undergraduate students must publish a paper in a scientific journal, master's students in a national scientific journal, and doctoral students in an international scientific journal.

It is easy to predict what will happen if this idea survives. Every university will set up a website, call it an online journal, and upload an essay, term paper or thought paper by every undergraduate or group of undergraduates, after running it through Turnitin.

It would be almost as easy to set up national websites and call them national online scientific journals and publish term papers by masters students.

But getting papers into international journals will be another matter, if the term "international" has any meaning and if the papers are to be single-author papers. We can expect a lot of creative workarounds here.
QS Paper 

Here are the links to a paper and presentation slides by Ben Sowter of QS that discuss the spin-off rankings: subject, regional and best student city.
What about the Kindergarten Rankings?

US News will be publishing the Best High Schools rankings on May 1st.

Sunday, April 22, 2012

Does Not Compute

London Metropolitan University seems to be getting rather desperate in its attempts to attract students. On the one hand:


"A London university is considering establishing alcohol-free zones on its campuses because so many of its students consider drinking to be immoral.
Professor Malcolm Gillies, vice-chancellor of London Metropolitan University, said the selling of alcohol was an issue of "cultural sensitivity" at his institution where a fifth of students are Muslim.
Speaking to a conference of university administrators in Manchester, he said that for many students, drinking alcohol was "an immoral experience".
"Because there is no majority ethnic group [at London Metropolitan], I think [selling alcohol] is playing to particular parts of our society much more [than to others]," he was reported as saying in the Times Higher Education magazine."
On the other:

"As part of a master's course in events experience management, London Metropolitan University will offer a module in partnership with Chillisauce, known for organising custom stag dos across Europe.
The firm's website lists options including mud-wrestling with scantily clad women in Budapest, a "spa with strippers" in Riga, or the option to be "punished" at a Tallinn "lap dancing dreamland".
The link has drawn criticism from unions, although students will be involved only in the company's more straightforward commercial activities - including corporate dinners and conferences.
Participants on the course will be asked to create a Guinness World Record attempt that doubles as a PR event for a consumer brand. They will devise a "creative concept" and pitch it to Chillisauce executives, who will attend seminars and lectures during the module.
But a University and College Union spokesman questioned "how employing a company that specialises in stag weekends offering wrestling with scantily clad women in jelly is likely to do much for a university's reputation"."
Opening the Black Box

I have just seen an article by Mu-Hsuan Huang of National Taiwan University, published in Research Evaluation, an Oxford University Press journal. Here is the abstract:

"In the era of globalization, the trend of university rankings gradually shifts from country-wide analyses to world-wide analyses. Relatively high analytical weightings on reputational surveys have led Quacquarelli Symonds (QS) World University Rankings to criticisms over the years. This study provides a comprehensive discussion of the indicators and weightings adopted in the QS survey. The article discusses several debates stirred in the academia on QS. Debates on this ranking system are presented in the study. Firstly, problems of return rate, as well as unequal distribution of returned questionnaires, have incurred regional bias. Secondly, some universities are listed in both domestic and international reputation questionnaires, but some others are listed only in the domestic part. Some universities were evaluated only by domestic respondents, limiting their performance of the ranking results. Thirdly, quite a few universities exhibit the same indicator scores or even full scores, rendering the assessment questionable. Lastly, enormous changes of single indicator scores suggest that the statistic data adopted by QS Rankings should be further questioned."

The article is useful and interesting, especially the table of rates of return for different countries. But it does not seem to go beyond what this blog and others have been saying for a long time.

This seems to be another case of mainstream academia lagging behind the mass media, which in turn is way behind the blogosphere.

Wednesday, April 18, 2012

The Euro-Rankings

The U-Multirank project has slowly crept past the feasibility study stage. While any attempt to undermine the triopoly of the big three global rankers (the Shanghai ARWU, QS and Times Higher Education) is welcome, it does seem to be taking a long time. One wonders whether it will get going before the Eurozone disintegrates. Still, the project does have its defenders.


"Jordi Curell, director of lifelong learning, higher education and international affairs at the directorate general for education and culture, conceded that there was opposition to its development.

"When we started working on the project of U-Multirank, many people from the higher education community were opposed to it,” he told an international symposium on university rankings and quality assurance in Brussels on 12 April.

But the system had intrinsic value, he said, because it would provide an evidence-based measure of the performance of European universities that would help them improve.

According to Curell, if higher education is to help Europe emerge from its current financial and economic crisis, the EU needs to know how its universities are performing and universities need to know how they are doing.
"Rankings which are carefully thought out are the only transparency tools which can give a comparative picture of higher education institutions at a national, European and global level," he told the symposium."



There are critics, of course. One of them is a committee of the British House of Lords, which has argued that U-Multirank is a waste of money. Four million euros sounds like rather a lot, but considering what the EU has spent money on, it is trivial.

And as for wasting the taxpayer's money, well, the committee could look at the other House and think about a floating duck island, pantyliners, nappies, soundproofed bedrooms and so on.

Tuesday, April 17, 2012

The Ranking Business

In Inside Higher Ed, Kris Olds comments on the growing commercialisation of university rankings.

Saturday, April 14, 2012

US Faculty Salaries

The Chronicle of Higher Education features a report from the AAUP on faculty salaries, along with data on student-faculty ratios. Predictably, Harvard, where the average full professor gets $198,000 per annum, is at the top.

If anyone ever has the time, it ought to be possible to work out the productivity of faculty with regard to publications, citations, and students taught and graduated.
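
If anyone does take it up, the arithmetic is simple enough: divide output by headcount or by payroll. Here is a minimal sketch in Python; the example institution and all the figures in it are placeholders I have invented, not AAUP or Chronicle data, and average full-professor salary is used only as a crude proxy for payroll.

# Per-faculty and per-dollar "productivity" ratios. All inputs are placeholders.

from dataclasses import dataclass

@dataclass
class Institution:
    name: str
    avg_full_prof_salary: float   # USD per annum (crude payroll proxy)
    faculty_count: int
    publications: int             # papers over some fixed window
    citations: int
    students_graduated: int

def productivity(inst: Institution) -> dict:
    payroll_millions = inst.avg_full_prof_salary * inst.faculty_count / 1e6
    return {
        "publications_per_faculty": inst.publications / inst.faculty_count,
        "citations_per_faculty": inst.citations / inst.faculty_count,
        "graduates_per_faculty": inst.students_graduated / inst.faculty_count,
        "publications_per_salary_million": inst.publications / payroll_millions,
    }

example = Institution("Hypothetical U", 198_000, 1_000, 5_000, 60_000, 7_000)
print(productivity(example))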

Thursday, April 12, 2012

News from Australia

There is now a website, MyUniversity, that allows anyone to compare Australian public universities on attributes such as staff qualifications, student-to-staff ratio, graduate employment and so on.

Of course, the value of the site will be no better than the quality of the information uploaded. Perhaps someone could take a look and see how accurate the data is.

According to University World News, the Australian university staff union is not happy.

'Rea said that if students were looking to base their choice of institution on whether a campus had an automatic teller machine, the site might be useful. But if they wanted an indication of the quality of teaching and research at any given institution, the information provided relied on a set of indicators that had been under question for many years.

The union had been critical for some time of the misuse of statistical data, such as graduate employment outcomes and student satisfaction results, in determining the quality of learning and teaching. Yet these were included as measurable indicators of quality by the website.

“The use of student satisfaction scores in particular is prone to manipulation and does not reflect quality in teaching. Indeed, if institutions based their courses on whether students liked their subjects, which is essentially what these metrics capture, they would risk driving down the quality of degrees from Australian universities.

“There is always a danger of teaching to the test – or the survey, in this case,” Rea said.

She said the diversity of Australian universities made it difficult to attempt any comparisons. Although the union believed students should be able to make an informed choice of where best to study, it should be just that – an informed choice based on accurate, clear and transparent information.

“This can only happen if the indicators or measures used to create this information are specific, widely understood and agreed, and incapable of institutional manipulation.” '

Tuesday, April 10, 2012

Ranking News: QS World University Rankings

Like many others, I have just received an invitation to take part in the QS academic survey, which contributes 40 per cent of the weighting in their World University Rankings. If I understand it correctly, QS now get the majority of their respondents from the Mardev mailing lists. That means that subscribing to an academic journal is sufficient qualification to rank the research capabilities of universities around the world.

The Times Higher Education World University Rankings also have a reputation survey. It is a little more difficult to join this club: to be a respondent it is necessary to have written (or co-written, or perhaps "co-written", since these days the concept of authorship is getting quite blurred) an article in an ISI-indexed journal. On balance, the THE survey is likely to be more reliable, although I think QS could argue that those who read academic journals but do little research are a constituency whose views should be considered.
Ranking News: US News

Robert Morse has announced that US News has begun to collect data for the 2013 US college rankings.

"We recently started collecting the statistical data that will be used for the 2013 edition of our college rankings, which will be published later this year. Data collection for the three U.S. News statistical surveys—the main one, financial aid, and finance—began on March 27.

These surveys gather information on such factors as enrollment, faculty, tuition, room and board, SAT and ACT scores, admissions criteria, graduation and retention rates, college majors, faculty, school finances, activities, sports, and financial aid. This data is used in the Best Colleges rankings that will be published on usnews.com and in the print guidebook that will be available on newsstands."

Monday, March 26, 2012

The University Challenge Rankings

The quiz show University Challenge provides a plausible supplement to the established British league tables. If we award two points for winning and one for being runner-up (I confess the data is from Wikipedia), then we get this ranking:

1.    Oxford                                     39
2.    Cambridge                                  21
3.    Manchester                                  8
4=.   Imperial College London                     5
4=.   Open University                             5
6=.   Durham                                      4
6=.   Sussex                                      4
8=.   St. Andrews                                 3
8=.   Birkbeck College, University of London      3
10=.  Bradford                                    2
10=.  Dundee                                      2
10=.  Keele                                       2
10=.  Leicester                                   2
10=.  Belfast                                     2
10=.  Warwick                                     2
10=.  Lancaster                                   2
17=.  LSE                                         1
17=.  Cranfield                                   1
17=.  Sheffield                                   1
17=.  York                                        1

Bristol, Edinburgh and University College London are not there at all, and LSE does not perform very well.

The show inspired an Indian version, and in one memorable "cup winners' cup" final, Sardar Patel College of Engineering beat Gonville and Caius, Cambridge.

Apologies for the weird spacing. I am looking up how to do tables in blogspot.
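
In the meantime, here is a minimal sketch in Python that applies the two-points-for-a-win, one-for-a-runner-up rule and prints a plain HTML table that could be pasted into a Blogspot post. The point totals are the ones listed above (derived from Wikipedia); the HTML is generic table markup, nothing Blogger-specific.

# Tally University Challenge points (2 per title, 1 per losing final) and
# render a basic HTML table for pasting into a blog post.

def points(wins: int, runner_ups: int) -> int:
    # The scoring rule used in this post; the totals below were worked out
    # this way from Wikipedia's list of finals.
    return 2 * wins + runner_ups

# Point totals as listed above.
totals = {
    "Oxford": 39, "Cambridge": 21, "Manchester": 8,
    "Imperial College London": 5, "Open University": 5,
    "Durham": 4, "Sussex": 4,
    "St. Andrews": 3, "Birkbeck College, University of London": 3,
    "Bradford": 2, "Dundee": 2, "Keele": 2, "Leicester": 2,
    "Belfast": 2, "Warwick": 2, "Lancaster": 2,
    "LSE": 1, "Cranfield": 1, "Sheffield": 1, "York": 1,
}

rows = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
lines = ["<table border='1' cellpadding='4'>",
         "<tr><th>Rank</th><th>University</th><th>Points</th></tr>"]
rank, last_score = 0, None
for position, (name, score) in enumerate(rows, start=1):
    if score != last_score:                 # tied universities share a rank
        rank, last_score = position, score
    lines.append(f"<tr><td>{rank}</td><td>{name}</td><td>{score}</td></tr>")
lines.append("</table>")
print("\n".join(lines))

Pasting the printed markup into the blog editor's HTML view should give a properly aligned table, which beats fighting with spaces in a proportional font.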

Tuesday, March 20, 2012

Comments on the THE Reputation Rankings

From Kris Olds at GlobalHigherEd

The 2012 Times Higher Education (THE) World Reputation Rankings were released at 00.01 on 15 March by Times Higher Education via its website. It was intensely promoted via Twitter by the ‘Energizer Bunny’ of rankings, Phil Baty, and will be circulated in hard copy format to the magazine’s subscribers. As someone who thinks there are more cons than pros related to the rankings phenomenon, I could not resist examining the outcome, of course! See below and to the right for a screen grab of the Top 20, with Harvard demolishing the others in the reputation standings.

I do have to give Phil Baty and his colleagues at Times Higher Education and Thomson Reuters credit for enhancing the reputation rankings methodology. Each year their methodology gets better and better.

From Alex Usher at Higher Education Strategy Associates

There actually is a respectable argument to be made for polling academics about “best” universities. Gero Federkeil of the Centrum für Hochschulentwicklung in Gütersloh noted a few years ago that if you ask professors which institution in their country is “the best” in their field of study, you get a .8 correlation with scholarly publication output. Why bother with tedious mucking around with bibliometrics when a survey can get you the same thing?

Two reasons, actually. One is that there’s no evidence this effect carries over to the international arena (could you name the best Chinese university in your discipline?) and second is that there’s no evidence it carries over beyond an academic’s field of study (could you name the best Canadian university for mechanical engineering?).

So, while the Times makes a big deal about having a globally-balanced sample frame of academics (and of having translated the instrument into nine languages), the fact that it doesn’t bother to tell us who actually answered the questionnaire is a problem. Does the fact that McGill and UBC do better on this survey than on more quantitatively-oriented research measures have to do with abnormally high participation rates among Canadian academics? Does the fact that Waterloo fell out of the top 100 have to do with the fact that fewer computer scientists, engineers and mathematicians responded this year? In neither case can we know for sure.

Sunday, March 18, 2012

The THE Reputation Rankings

Times Higher Education has produced its third reputation ranking, based on a survey of researchers who have published in ISI-indexed journals. The top ten are:

1.  Harvard
2.  MIT
3.  Cambridge
4.  Stanford
5.  UC Berkeley
6.  Oxford
7.  Princeton
8.  Tokyo
9.  UCLA
10. Yale

This does not look all that dissimilar to the academic survey indicator in the 2011 QS World University Rankings. The top ten there is as follows:

1.  Harvard
2.  Cambridge
3.  Oxford
4.  UC Berkeley
5.  Stanford
6.  MIT
7.  Tokyo
8.  UCLA
9. Princeton
10. Yale

Once we step outside the top ten, there are some differences. The National University of Singapore is 23rd in these rankings but 11th in the QS academic survey, possibly because QS still has respondents on its list from the time when it used World Scientific, a Singapore-based publishing company.

Middle East Technical University in Ankara is in the top 100 (in the QS academic survey it is not even in the top 300), sharing the 90-100 band with Ecole Polytechnique, Bristol and Rutgers. At first glance this seems surprising, since its research output is exceeded by that of other universities in the Middle East. But the technical excellence of the University Ranking by Academic Performance, which METU itself produces, suggests that its research might be of high quality.

Sunday, March 11, 2012

Law School Rankings

Anyone interested in the current arguments about American law school rankings might visit the blog of Lawyers against the Law School Scam.
University Ranking in Pakistan

University World News has an article on the ranking of Pakistani universities by the country's Higher Education Commission (HEC). According to Ameen Amjad Khan:

According to the Higher Education Commission (HEC) ranking, Islamabad's Quaid-e-Azam University tops 136 public and private sector institutions, followed by the Pakistan Institute of Engineering and Applied Sciences, with Karachi's Aga Khan University in third place.

Academics from the University of Karachi and the University of Peshawar have rejected the ranking, which does not place either institution in the top 10.

They have accused the HEC of tampering with the standard formula to favour some institutions and have demanded that their vice-chancellors formally convey their disapproval to HEC bosses.

Faculty members of Hyderabad’s Liaquat University of Medical and Health Sciences have even warned that they will take the matter to the court if the ranking is not revoked. They said in a statement on 29 February: “The HEC announced the rankings in haste and caused chaos in both public and private higher education institutions.”

It seems that the rankings were based on a modified version of the QS World University Rankings methodology. Among the modifications was the introduction of indicators based on the number of journals published and the number of grants from the HEC.