Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, July 29, 2008
The US college scene is buzzing about the latest ranking news. No, it's not the impending THE-QS rankings but Princeton Review's list of top party schools. See here, here, here, here, here, here, here, here, here and here.
It seems that US colleges are more concerned about being in the top 20 party schools or, in a few cases, the top 20 stone-cold sober schools, than about their position in the THE-QS rankings. And who is to say that they are wrong?
The Princeton Review's Annual College Rankings, based on a survey of 120,000 US students, is now out. There are 62 lists and there seems to be something for everyone. Here are some of the top schools.
Middlebury College -- best professors
Wheaton College -- tastiest campus food
Loyola College -- best dorms
Yale -- most beautiful campus
Northeastern University -- best career and job placement service
Stanford -- best classroom experience
Texas A and M (College Station) -- most conservative students
Occidental College -- most liberal students
University of Florida (Gainesville) -- top party school
Brigham Young University -- top stone-cold sober school
City University of New York -- most diverse student body
University of Maryland at College Park -- best athletic facilities
Monday, July 28, 2008
The Ecole des Mines de Paris has produced a new ranking based on the number of corporate leaders trained by universities. Its website reports:
The École des Mines de Paris ranking is based on a quite different criterion: the number of alumni holding a Chief Executive Officer position (CEO) in one of the 500 leading worldwide companies as of the date of the Shanghai ranking 2006. This criterion is aimed at being the equivalent among companies of the criterion for alumni who have been awarded the Nobel Prize or the Fields Medal, as the numbers involved are similar.

We have chosen to identify the 500 leading enterprises on the basis of the "Global Fortune 500", based on the criterion of the published annual turnover and conducted by Fortune magazine.
It is good to be reminded that universities do other things besides research. Still, this is a very limited measure of excellence.
Here are the top ten:
1. Harvard Univ
2. Tokyo Univ
3. Stanford Univ
4. Ecole Polytechnique Paris
5. École des Hautes Études Commerciales Paris
6. Univ Pennsylvania
7. Massachusetts Inst Tech (MIT)
8. Sciences Po - Paris
9. ENA Paris
10. Ecole des Mines de Paris
There are five French schools in the top ten, which may say something about the excellence of these institutions or perhaps something about French cultural introversion. Only four foreign institutions have contributed to the training of the 37 French CEOs. For the 38 British CEOs the corresponding figure is 22, including Gettysburg College, the US Naval Academy, University College Cork, Utrecht, Witwatersrand and, of course, the Ecole des Mines.
I noticed that in at least two cases, Strode's College in Surrey and Malay College in Malaysia, secondary schools were listed as diploma-granting institutions.
Monday, July 21, 2008
BBC News reports that graduates of lower ranking universities are more dishonest than those from institutions further up the ranking ladder.
"Analysis of 3,876 financial service job applications found embellishments on the forms of 43% of applicants from the UK's lowest ranking universities.
Only 14% of applicants from the top 20 UK universities were found to have fibbed in their applications.
The survey was commissioned by a pre-employment screening firm, Powerchex.
Its managing director Alexandra Kelly said: "What this survey says is that graduates from lesser-known universities may feel the need to alter their background to compete."
There is no definitive ranking of universities. For the survey the researchers at the Shell technology and enterprise programme used the Times Online 2009 ranking. "
Given the amount of soft marking and numbers massaging that British universities do to get a good place in the rankings, is it possible that graduates of the good ones do not need to embellish their CVs because their universities do all the necessary embellishment for them?
Friday, July 18, 2008
One consequence of the ranking craze is the proliferation of "minor leagues" as universities and countries that do badly on prominent rankings encourage the creation of more and more indexes and league tables on which they hope to do better.
The French Senate is now proposing a new ranking system for European universities. It is not happy with the attention given to the Shanghai Jiao Tong index where French institutions do not perform very well.
The report on the euractiv site continues:
"France's key bone of contention with the Shanghai index is that the number of citations of a institution's scientific research is used as a ranking factor. Paris says this works against countries that do not publish in English.
Senator Joël Bourdin, the rapporteur on the report, argues that the intrinsic value of the Shanghai ranking is highly questionable ["très discutable"] and owes its interest only to its "mobilising effect".
The highest-ranking French university in the 2006 Shanghai ranking was Paris VI in 39th place, while American universities occupy more than half of the top 100. The only European universities in the top 10 are the UK's Universities of Cambridge and Oxford.
While French Higher Education Minister Valerie Pécresse has said she wishes to use the French EU Presidency to "lay down the foundations of common European criteria" for university classification, Bourdin argues that France should already go ahead and develop its own national classification system, as an EU-level process would take years. "
One wonders why the French do not simply use the citations per faculty section of the THE-QS rankings, which shows the Ecole Normale Superieure in 4th place, just ahead of the University of Alabama (5th), Pohang University of Science and Technology (11th), the University of Helsinki (17th) and the Hong Kong University of Science and Technology (20th). On this measure the ecole does better than Princeton, Harvard, Cornell and Columbia.
The report can be viewed here.
A report in the Independent (UK) provides further evidence of the destructive effects of the obsession with rankings.
"The university degree classification system is "descending into farce", the chairman of the Commons Select Committee on Universities has said.
Phil Willis was speaking as MPs questioned Peter Williams, the chief executive of the Quality Assurance Agency (QAA), the higher education watchdog, on degree standards. "An individual institution can award as many firsts as it wants, provided it satisfies its own criteria on what is a first," Mr Willis said.
It followed comments from Professor Geoffrey Alderman, the former head of quality at the University of London, which were reported in The Independent, that lecturers had been told to "mark softly" to ensure enough first-class degree passes were awarded to win a high ranking in league tables. He also alleged universities were turning a "blind eye" to plagiarism by international students because they were dependent on income from their fees. "
Wednesday, July 16, 2008
This is in reply to comments by Ben Sowter, head of research at QS, on my post of July 10th.
First of all, I was mistaken about the BlackBerry. At the end of the survey there is a box to tick if respondents want a chance to win one.
Next, I am glad to learn that Ben is a frequent reader of this blog. Thank you also for the comment that "Much of your analysis is detailed and shrewd. In fact, whilst I have declined to comment in the past we have kept an eye on your blog and, in some areas, chosen to make improvements based on your observations".
The clarification that the current survey is of prior respondents only and that later e-mails will also be sent to World Scientific subscribers is also welcome.
As for wondering why Malaysian universities were not on the first list, the introductory message says that "You'll notice some slight differences to the survey this year - mainly in that we have included a lot more universities - as a result there are two questions about universities - one asking about those around the world, and a second asking about your own country specifically". I assumed that "around the world" meant that every country in the world would be included. The information should have read "around the world excluding your own country". Still, I accept that I jumped to an unwarranted conclusion.
I think that there are serious questions remaining about the implications of this change. If QS are going to use only the responses to the first list when calculating the score for the academic opinion criterion, then this would surely mean a noticeable reduction in the number of votes for universities that got a large number of responses from other places in their own country. I suspect that this would have a disproportionate effect on American and Japanese universities. Wouldn't it also have a relatively beneficial impact on British universities with many graduates in the Commonwealth and on Australian universities with graduates in South East Asia and China, countries with many names in the World Scientific database?
Ben's comment that the purpose of having two lists is so that "we can do some more analysis on response dynamics and, potentially, more effectively counteract country biases" suggests that this might be the case since the implication is that the second list will not be used to construct the scores for the "peer review".
Is it possible that such a change would give Cambridge, Oxford and Imperial College London a further advantage over Harvard and push one of the former three into the number one spot?
I recently posted on what I thought was the removal of Malaysian universities from the THES-QS survey of academic opinion.
It appears, as comments on the post and my own fiddling about with the survey indicate, that what has happened is that the survey is divided into two parts: the first in which all universities in the country where the respondent works are excluded and a second in which only those universities -- minus the one where the respondent works -- are presented.
It seems that I have been unfair to QS on this occasion. There are, however, some issues about the survey that I will discuss in a little while.
Two comments refer to the absence of the State University of New York from the list. In fact, Stony Brook is there but not the other three campuses at Binghamton, Albany and Buffalo.
Thursday, July 10, 2008
How to get into the Top 50
Malaysia has declared that it wants to get a couple of local universities into the world's top 50. This obviously means the THES-QS and not the Shanghai Jiao Tong index.
So what should a Malaysian university do to get into the top 50 on the THES-QS ranking? I assume that the likeliest candidate is Universiti Malaya (UM) and that it should try to equal the score of the University of Auckland, which was 50th in last year's index.
In 2007, Auckland got a total score of 77.5 made up of 95 for the "peer review", 83 for the employer review, 38 for student faculty ratio, 61 for citations per faculty, 100 for international faculty, and 99 for international students.
UM got a total of 49.4 made up of 66 for the "peer review", 66 for the employer review, 38 for the student faculty ratio, 14 for citations per faculty, 63 for international faculty and 41 for international students.
So what does UM have to do to get a score equal to Auckland's?
- First of all, it should make sure that UM is actually included in the survey of academic opinion (see previous post). If it is not, then UM will probably not even make it into the top 500. Then it has to improve its score on this criterion by about half (the use of Z scores means that we can't be more exact than this).
- Then the score on the employer review would have to increase by about a third.
- No need to worry about the student faculty ratio. UM is as good as Auckland.
- The number of citations per faculty would have to improve four-fold.
- The proportion of international faculty would have to increase by about two thirds.
- The proportion of international students would have to more than double.
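The rough arithmetic behind the list above can be sketched as follows, using the 2007 component scores quoted earlier. This is only an illustration, not QS's actual method: the real rankings standardise scores with Z scores, so simple ratios of published component scores are loose approximations at best.

```python
# Illustrative sketch only: estimate how much UM's score on each
# THES-QS criterion would have to grow to match Auckland's, using
# the 2007 component scores quoted in the post. Z-scoring in the
# real rankings means these ratios are rough approximations.

auckland = {
    "peer review": 95,
    "employer review": 83,
    "student-faculty ratio": 38,
    "citations per faculty": 61,
    "international faculty": 100,
    "international students": 99,
}

um = {
    "peer review": 66,
    "employer review": 66,
    "student-faculty ratio": 38,
    "citations per faculty": 14,
    "international faculty": 63,
    "international students": 41,
}

# Factor by which each UM component score would need to be multiplied
needed = {k: auckland[k] / um[k] for k in auckland}

for criterion, factor in sorted(needed.items(), key=lambda kv: -kv[1]):
    print(f"{criterion}: x{factor:.2f}")
```

Running this reproduces the estimates in the bullets: citations per faculty needs roughly a four-fold improvement (61/14 ≈ 4.4), international students must more than double (99/41 ≈ 2.4), and the student-faculty ratio needs no change at all.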
I doubt that it is worth trying to do better on the "peer review" and the "employer review" since these are opaque and biased. Increasing the number of international faculty and students would probably cause more trouble than it's worth without strict controls over quality.
Could the citations per faculty be improved? This is not totally impossible. Over ten years Cardiff University managed to triple total research output. According to an interview with the Star, noted in an earlier post, this is how it was done.
To encourage productivity, Prof Smith switched the promotion system from a quota-based system (where the total number of professorial positions in a faculty were pre-determined) to a performance-based one.
He even offered an attractive retirement package to faculty members who were not producing much research.
However, in order for universities to be able to do that, Prof Smith said they need autonomy.
“The university has to be free to offer different contracts (to academics and scientists).
“And within the university, a lot of power needs to be devolved to the young people.
“It's all about having decisions taken at the lowest level practicable.
“That’s a major change,” he said.
It's that time of year again. A few days ago I received an invitation to take part in the THES-QS "peer review". Last year it came via World Scientific, the Singapore-based publishing company whose subscription list has been used by QS to construct their list of "smart people" who pass judgement on the world's universities. This year it came directly from QS. I do not know whether this means that QS is now using the THES subscription list as its database. If so, we can expect to see some wild fluctuations in the "peer review" and hence the overall rankings in October.
Anyway, here is the message from QS.
It's that time of year again. Each year the response to the academic peer review for the Times Higher - QS World University Rankings goes from strength to strength.
QS and Times Higher Education are committed to making these rankings as strong and robust as they can be. Many enhancements have been made to the system in the last 12 months (you can read about them on http://www.topuniversities.com/) but amongst the most important has been your help in increasing the response to our academic peer review questionnaire.
Put simply... your opinion counts. Please share it with us.
You'll notice some slight differences to the survey this year - mainly in that we have included a lot more universities - as a result there are two questions about universities - one asking about those around the world, and a second asking about your own country specifically.
Please be as accurate and honest as possible. Help us make sure that your university contributes a representative response to the survey this year.
http://research.qsnetwork.com/academic
The deadline for response is July 15th.
At the end of the survey, from a selection of offers, you will have the chance to either...
Opt for a $100 discount on delegate's fee for QS APPLE (Asia Pacific Professional Leaders in Education Conference & Exhibition) 2009 in Kuala Lumpur
Enter a draw to win your university a free exhibition table at the QS World Grad School Tour (the world's leading circuit of Masters & PhD recruitment fairs) in a city of your choice
Receive a one-month trial subscription to Times Higher Education
Thank you for taking the time to share your opinions with us, and please look out for the results of the Times Higher - QS World University Rankings 2008 - due to be published on October 9th.
Many thanks,
Ben Sowter
Head of Research
QS
I was disappointed that I would not have a chance of getting a BlackBerry this time around. I started to fill out the form but stopped when I noticed something very odd. There are no Malaysian universities listed this year. Possible explanations are:
A. For some reason, QS have decided that no Malaysian university is of sufficient quality to even be included in the survey. This is unlikely considering some of the others that are included.
B. QS state at the start of the survey that the respondent's own university will be excluded from consideration. Since I work at a Malaysian university it is to be expected that that particular university would not show up. Perhaps some sort of error has meant that all Malaysian universities have been excluded from the list presented to me.
C. QS have made a principled decision that respondents are not allowed to choose any university in the country in which they work. This would be a good idea and therefore can probably be ruled out straightaway.
D. QS just forgot about Malaysia.
E. A computer error that affected me and nobody else.
Based on past experience, D seems the most likely, followed by B. If D, then in October Malaysian universities are going to get zero on this year's "peer review" and therefore will fall even further in the rankings. There will no doubt be a mass outbreak of soul searching in Malaysian universities and jeering by the opposition. If B, the fall will not be so great, but Malaysian universities would still suffer if they are the only ones that cannot receive votes from within their own country.
I appeal to any reader of this blog who has completed or is about to complete the THES-QS survey to let me know whether they have also noticed the omission of Malaysian universities or of any other country.