Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Monday, September 17, 2007
The recent Forbes ranking of business schools has the Broad School of Business at Michigan State University at number 19 in the US but does not mention the University of Michigan in the top 100.
The Economist has the Stephen M. Ross School at the University of Michigan in 9th place for the US but does not mention Michigan State University in the world's top 100.
Is it possible that somebody has got the two Michigan schools mixed up?
I notice that the Financial Times Global MBA ranking has the University of Michigan in 19th place and Michigan State University in 38th. Could it be that both Forbes and the Economist have made a mistake?
This one is from the Economist. The top 10 US schools are
1. Dartmouth
2. Stanford
3. Chicago
4. Northwestern
5. Harvard
6. New York University
7. University of Michigan
8. Berkeley
9. Columbia
10. Virginia
The top 5 outside the US are:
1. IESE (Spain)
2. IMD (Switzerland)
3. Cambridge
4. Henley
5. IE (Spain)
There are some noticeable differences between these and the Forbes rankings. Forbes puts the University of Pennsylvania in 5th place for the US but the Economist has it in 12th. The Marriott School of Business at Brigham Young University is 18th in the US in Forbes but does not appear in the Economist.
It is possible that this is because the Economist rankings measure a much broader range of criteria including, for example, the number of staff with PhDs and the number of foreign students.
Forbes magazine (print edition 3/9/07) has another ranking of business schools. The top ten US schools, meaning those providing the greatest financial benefit to their graduates, are:
1. Dartmouth
2. Stanford
3. Harvard
4. Virginia
5. Pennsylvania
6. Columbia
7. Chicago
8. Yale
9. Northwestern
10. Cornell
The top 5 outside the US are:
1. IESE (Spain)
2. London
3. Manchester
4. York (Canada)
5. Ipade (Mexico)
Warning: the ranking is based on a survey of alumni of MBA programmes. Surveys were sent to 18,500 alumni of 102 programmes and there was a response rate of 22%. This means that the ranking is based on about 4,070 replies or about 40 per school. It is unlikely that the response rate was constant across programmes so for smaller schools the number of responses may be quite low. This raises questions about the validity of the results.
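The arithmetic behind that warning can be checked in a few lines of Python (the survey figures are those quoted above; the per-school average assumes an even response rate across programmes, which, as noted, is unlikely to hold in practice):

```python
# Figures as reported: surveys sent, number of MBA programmes covered,
# and overall response rate.
surveys_sent = 18_500
programmes = 102
response_rate = 0.22

replies = surveys_sent * response_rate
per_school = replies / programmes

print(round(replies))     # about 4,070 replies in total
print(round(per_school))  # about 40 replies per school, if spread evenly
```

Forty responses per school is a thin base for a ranking, and any programme with a below-average response rate would rest on fewer still.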
Wednesday, September 12, 2007
Last April Peter Sacks launched an attack on the US News and World Report college rankings. There is a lot on which I agree with him, especially about the role of the survey of senior administrators' opinions of their peers (the "beauty contest").
I'm not a statistician, but it hardly requires a degree in econometrics to determine that graduation rates, student-faculty ratios, acceptance rates, alumni giving rates, and all the factors in the U.S. News methodology are profoundly correlated to the institution's selectivity -- how many freshman the institution accepts for admission relative to the number who apply. And none of these factors is related to selectivity more than freshmen SAT scores. In the U.S. News worldview of college quality, it matters not a bit what students actually learn on campus, or how a college actually contributes to the intellectual, ethical and personal growth of students while on campus, or how that institution contributes to the public good. College quality in the U.S. News paradigm boils down to the supposed quality of freshmen the day they pass through the ivory gates -- long before they write a single college essay or solve a physics problem.
Sacks then goes on to praise a young woman, a daughter of affluent Harvard-educated parents who has renounced expensive tutoring for the SAT since she feels that with all her advantages she does not deserve any extra help.
Leaders of these institutions ought to take a lesson from one young Massachusetts woman named Esther Mobley. Attending a top-notch high school in an affluent suburb of Boston where parents buying high-priced SAT tutoring for their kids is like death and taxes, Mobley opted out, declining to take an SAT prep course. According to the New York Times, she did so on the simple principle that kids like her, growing up with Harvard-educated parents and every educational advantage, don't need or deserve such extra help.

The problem is that if the SAT is not used to admit students to elite institutions, then what will be? Esther Mobley did not give up her advanced placement Latin, did not restrain herself from talking about the subjunctive in Catullus or Kierkegaard's existential choices, did not stop acting, did not resign as president of the church youth group or ask her parents to transfer her to a school with less competitive classmates and less motivated and qualified teachers. She eventually got into Smith College.
In the end, isn't this all more dependent on parental wealth, education and status than the qualities measured by the SAT? Surely an SAT-free admissions policy will favour the children of the affluent and educated much more than current practices do.
The Keck School of Medicine at the University of Southern California has appointed a new Dean. What are his qualifications? What is he going to do?
Get the school a better place in the rankings. What else does the dean of a medical school do?
Dr. Carmen A. Puliafito, the current chair of the University of Miami's Department of Ophthalmology, will replace Dr. Brian Henderson, whose three-year contract as interim dean of Keck expired this year.
University officials said they expect the new dean's leadership to help vault Keck into a higher ranking among medical schools.
"Clearly, what we aim for with Dr. Puliafito's leadership is to move up the Keck School of Medicine in the rankings," said Provost C.L. Max Nikias, who chaired the faculty search advisory committee that selected Puliafito. "We want it to be up there in the very top tier of medical schools in the country."
Auburn University has appointed the CEO of a management development company to teach courses for professional engineers. This is supposed to push the university higher in the USN&WR rankings, although the university does not say exactly how this will happen.
The school said it expects Brewer's expertise to push it higher in national rankings, placing it among the country's elite engineering programs. The Auburn engineering program was ranked No. 40 in the nation among major universities, according to U.S. News and World Report.
Monday, September 10, 2007
Around the world over the last few decades there has been a shift from selection for secondary and tertiary education by tests of cognitive ability or academic achievement to selection based on "holistic" methods of assessment.
American students in particular, but increasingly many others, are risking mental and physical breakdown as they compete in sports (more than one seems to be required now), volunteer, debate, find interesting part-time jobs, travel abroad, suffer a trauma or enjoy an epiphany that has some sort of socio-cultural significance, and spend hours agonising over producing an application essay that will attract the attention of an admissions officer who has read thousands of such things over the years.
One of the reasons for minimising the role of standardised tests like the SAT and GRE in the admissions process is that the coachability of such tests creates an unfair advantage for the upper and middle classes, who can afford tutors, cramming courses and coaches.
An issue that does not seem to have been raised very much is the comparative cost of coaching for the SATs, for example, and preparing a profile that will impress an Ivy League admissions officer.
How much coaching will it take for someone of average intelligence to get a perfect score on the SAT or, if British, a bunch of grade As at A-level? How much would it cost to equip him or her with the background to impress an American admissions officer?
A natural experiment was performed on Prince Harry recently. The most expensive school in England couldn't get him anything better than a B in art and a D in geography at A-level. But, apart from his academic performance, he had all the other requirements for university admission: volunteering, leadership skills, foreign travel, sports and so on. He could probably even, with guidance, produce a poignant essay about how his life had been diminished by a lack of cultural diversity.
To put it simply, you can, if you are wealthy and obsessed enough, buy perfection or something close to it on the non-academic criteria, but you cannot buy anything more than a modest improvement in the SAT, GRE or A-levels (at least in traditional academic subjects).
So why is the obsession with holistic assessment and well-rounded students considered to be progressive?
Things reach the point of absurdity when middle-class American students have become so indistinguishably perfect that they are advised to introduce a minor but detectable flaw into their university applications. A story posted on mywire recounts how:
"Steven Roy Goodman, an independent college counselor, tells clients to make a small mistake somewhere in their application — on purpose.
"Sometimes it's a typo," he says. "I don't want my students to sound like robots. It's pretty easy to fall into that trap of trying to do everything perfectly and there's no spark left."
What Goodman is going for is "authenticity" — an increasingly hot selling point in college admissions as a new year rolls around.
In an age when applicants all seem to have volunteered, played sports and traveled abroad, colleges are wary of slick packaging. They're drawn to high grades and test scores, of course, but also to humility and to students who really got something out of their experiences, not just those trying to impress colleges with their resume. "
And what happens when everybody has achieved perfection and everybody (or at least those who can afford the fees charged by the likes of Dr Goodman) has inserted that spark-revealing error? Two errors?
Saturday, September 08, 2007
An essential element in maintaining academic standards is the freedom to research, teach and engage in public discussion. One case where there seems to have been an erosion of such freedoms involves Norman Finkelstein, until recently an assistant professor at DePaul University in Chicago.
This post is a bit late now that Finkelstein and DePaul have reached an agreement that he will resign while the university has issued a statement praising his performance as a teacher and a scholar.
If anyone is not aware, Finkelstein has written extensively on the historiography of the Holocaust, among other matters, presenting a highly controversial view. He believes, to simplify things very drastically, that the Holocaust has been used as an excuse to justify oppressive Israeli policies in Gaza and the West Bank.
Finkelstein was recently denied tenure at DePaul, apparently on the grounds that he showed insufficient respect for the opinions of others. This may be true, but it should not be an issue and Finkelstein should not have been denied tenure for such a reason.
I had better make it clear that there are many things on which I disagree with him. He is also strident, rather too prone to self-pity and very rude to middle-aged ladies (he calls Deborah Lipstadt, a professor at Emory University, the Elsie the Cow Chair in Judeo-yenta Studies).
That being said, universities ought not to give or withhold promotion or appointment on such grounds. There seem to have been no complaints about his teaching, and his research does not, as far as I know, show any signs of plagiarism or data fabrication. Maybe he draws the wrong conclusions from the data, but that should not be at issue.
It appears that one factor in the denial of tenure was Finkelstein's argument with Alan Dershowitz, a professor at Harvard. There is no time to go into the details here, but if DePaul was swayed by pressure from Dershowitz this would be highly inappropriate.
Returning to the question of double standards raised by the Southern Illinois plagiarism case, we note that Dershowitz has shown no respect for the opinions of Finkelstein either, but I have heard no suggestion that his career might be jeopardised. Neither has Deborah Lipstadt been threatened with sanctions for calling Finkelstein "the dirt you step in on the street".
I have already noted the various plagiarism controversies at Southern Illinois University. The case is perhaps most notable for the double standards applied to Chris Dussold, an instructor who copied a colleague's teaching philosophy statement, and the university president Glenn [Glendal on the dissertation] Poshard, who has been accused of plagiarising his 1984 PhD dissertation on the education of gifted children in Illinois.
Dussold was sacked but Poshard had asked the department that awarded the degree to review the dissertation and indicate any changes that needed to be made. The department, very much to its credit, has declined to do so.
"Under attack for allegedly plagiarizing parts of his 1984 thesis, Poshard last week asked the university's department of higher education and administration, which awarded his PhD, to review the work and recommend action. As president, Poshard now oversees that department.
"The department has concluded that a committee with broader academic representation would be more appropriate for this review," SIU spokesman Dave Gross said in a statement, noting that the decision leaves the review process unresolved."
I would like to suggest that Southern Illinois adopt the policy that I believe, with variations, is used in many universities for undergraduate writing. First case of plagiarism -- write the paper again on another topic. Second case, zero for the course. Third time, out.
So Dussold should write his teaching philosophy statement again and Poshard can write another dissertation. It shouldn't take very long. At 111 pages, including a lot of tables and, in the first quarter at least, a lot of white space, it is not exactly The Decline and Fall of the Roman Empire.
Perhaps he could write about something like Promoting Academic Honesty in a Public University.
The obsession with university rankings is now spreading in Africa. Makerere University in Uganda is concerned about its low position in the rankings (evidently those produced by Webometrics, since it does not have any sort of position in the others). It is therefore imposing a levy of 50,000-80,000 Ugandan shillings a year on students (not, it seems, on faculty or administrators) to improve ICT at the university and thereby boost Makerere's position in the rankings.
"MAKERERE University will charge an ICT fee for both private and government-sponsored students.
Each undergraduate will have to pay sh50,000 and post-graduates will pay sh80,000 per academic year.
The decision was taken in a meeting held last week, according to the university senior public relations officer, Gilbert Kadilo.
He said the money would be used to boost the university's Information and Communication Technology (ICT) programme.
"The university is moving towards internationalising its academic programmes and this requires investment in ICT," Kadilo explained yesterday.
He added that the policy was aimed at facilitating the publication of university research on the internet so as to improve the international ranking of the university."
Even if this works, one wonders whether it would be worth it. Wouldn't a fee to buy books and journals for the university library be a better idea?

Monday, September 03, 2007
The search for status has its costs. The rush to climb the rankings ladder has produced many distortions in university policies and practices throughout the world and may even be a significant contributor to the current plague of academic dishonesty.
There has been a spate of plagiarism accusations at Southern Illinois University (SIU). One involves Chris Dussold who apparently copied his two page teaching philosophy statement from a colleague and was sacked. Such statements are seen by many as a waste of time and are usually totally insincere. Committing plagiarism in producing such things is on about the same level as submitting an unoriginal letter of application or admissions essay. The Dussold case is still sub judice so all I can say is that he has my sympathy. He should, I suppose, have been told to go and write the statement again but dismissal seems far too draconian a step.
The President of the university, Glenn (why was it Glendal on his dissertation?) Poshard, has been accused of plagiarising his doctoral dissertation on the education of gifted children, which was submitted to SIU's Carbondale campus's Department of Educational Administration and Higher Education, from which he received his PhD. According to the Chicago Sun-Times:
'He will ask the department -- now under his command as president of the SIU system -- to review the document "and to advise me on corrections necessary to make this dissertation consistent with the highest academic standards.
"I will make whatever changes are recommended by the department, and by doing so I hope to fulfill the highest expectations that you have of me as your president," the former congressman said.
The allegations first surfaced Thursday in the student newspaper, the Daily Egyptian, which said it found 30 sections of Poshard's paper that contained verbatim text from other sources that either wasn't placed in quotation marks or wasn't cited properly.
Poshard said at a news conference Friday it's possible he made some mistakes in the 111-page paper, but they were "unintentional." Nevertheless, "they need to be promptly acknowledged and remedied," he said.'
Note that the department has not been asked to consider whether Poshard should suffer the same fate as Dussold, whose offense was surely much less.

There was another case of plagiarism at SIU, one which seems related to the current fashion for getting as high up the rankings as possible. In 2001 SIU produced a plan called Southern at 150. The idea was for SIU to get into the top 75 public universities in the US (presumably as ranked by the US News and World Report) by its 150th anniversary. It seems that much of the plan was very similar to one called Vision 2020, published by Texas A and M University at College Station in the late 1990s as part of its drive to be a top ten university by 2020.
Something that nobody seems to have noticed is that the idea of calling a strategic plan "Vision 2020" was apparently not original to anyone in Texas. It has been used to describe long-term plans by the cities of Bakersfield and Hamilton and the governments of India and Trinidad among others. It seems to have been first used by the Malaysian prime minister, Dr Mahathir Mohamed, in February 1991 to refer to his country's aspirations to economic and social development.
I doubt that anyone could be disciplined for plagiarising two words, but still, an acknowledgement of Dr Mahathir's prior use of the phrase would have been polite.
Maybe everyone at Texas A and M will say that the idea of "Vision 2020" was completely original, perhaps occurring after a visit to the optician's, and that this is simply a coincidence. This is possible but in some ways more disturbing. Malaysia in the 1990s was one of the world's fastest growing economies and it says a lot about the parochialism of the army of Texan experts if none of them had ever come across a reference to Malaysia's strategic plans.
Incidentally, there is a reference to the local governments around Puget Sound in Washington State adopting a Vision 2020 in October 1990, but it is not clear that this was published at the time and it certainly never got the attention that Dr Mahathir's statement did.
Thursday, August 30, 2007
Steve Sailer has a post that compares the Graduate Record Exam results of candidates by intended field of study and calculates their mean IQs. There may be some methodological leaps here but the results are interesting. Here is a selection:
Physics & astronomy 133
Mathematical sciences 130
Philosophy 129
Economics 128
Engineering 126
Chemistry 124
English language & lit 120
History 119
Sociology 114
Business 114
Psychology 112
Business admin & mgmt 111
Student Counseling 105
Early Childhood education 104
Social Work 103
Is it possible that the distribution of students and faculty across disciplines might be a crude but useful proxy for the overall intelligence of staff and students at specific universities?
Also, I can't help but wonder whether QS's succession of errors (counting ethnic minorities in Malaysia as international faculty and students, getting hopelessly mixed up over Duke's student faculty ratio and so on) compared with Shanghai Jiao Tong University's relatively blemish-free rankings has something to do with the former being led by a couple of MBAs and the latter by someone with degrees in chemistry and engineering.
Nunzio Quacquarelli, director of QS Quacquarelli Symonds (QS), has some advice for applicants to business school. He points out that rankings of business schools are controversial and that many believe they are badly flawed:
"For the last decade, the management education sector has been obsessed with the ranking of business schools. Publishers such as the Financial Times, Business Week and the Wall Street Journal sponsor regular surveys that stoke interest to the point that their coverage produces some of their top selling editions. There is now a growing controversy about whether these rankings provide useful information for MBA applicants, or are misleading and creating a 'herd instinct' towards a few schools, which is of no benefit to anyone.
Business school officials differ in their views of rankings. Although The Wharton School frequently tops rankings, Dean Pat Harker feels, "There is a very strong consensus among parties (alumni, faculty and staff of other institutions), that the ranking methodologies are severely flawed... Some people believe that if the rankings help us, who cares if they are flawed or give a limited view of the school? But we can't have it both ways. We either endorse a defective, inconsistent practice, or we speak out, offer better alternatives for information, and work with the media to enable them to report with more useful, objective data." "
After reviewing various rankings, Quacquarelli looks at QS's own MBA scorecard which allows readers to create their own rankings by changing the weighting of the various components. Unsurprisingly, he thinks highly of the scorecard:
"Rachel Tufft, Marketing Director at Manchester Business School, feels, "Scorecard is the most in-depth and interactive information tool available for MBA applicants today." The MBA Director at Cranfield adds, "TopMBA.com Recruiter Research adds a great deal of value because it is a clear statement from the marketplace about the popularity of international MBA programmes with recruiters. It also gives us a clear indication of what we need to do to improve. Any improvements we make to enhance our visibility amongst recruiters will be of direct benefit to our students - another sign of useful research." "
Quacquarelli does not mention that earlier this year QS provided the data for Fortune's 2007 "50 Best B-Schools for Getting Hired." The University of North Carolina at Chapel Hill's Kenan-Flagler Business School was outraged at not being included and was shocked by "the shoddy, inaccurate and inappropriate research methods employed in the Ranking of Top 50 Business Schools."
Kenan-Flagler has listed QS's errors in detail:
" QS has admitted that they did not contact us for this ranking. They admitted that they used data, often out-of-date information, collected for another purpose. They explained our exclusion by saying that they confused our business school with another North Carolina school (NC State).
Every major ranking organization notifies schools of impending rankings and requests data as input. QS did not. Virtually all data-collection organizations have verification and validation procedures. QS did not. Every publication announcing rankings would at least cross check that major schools from existing, established rankings were included. QS did not.
We and other schools have already uncovered multiple serious issues in data collection and analysis. The salary figures for our and other MBA Programs are outdated or wrong. Some data come from 2004, some from 2005, and some schools have reported the numbers don’t match their data for any year, even though QS contends that the data are all from 2006. If we were to use accurate Kenan-Flagler salary data alone, we would expect to be in the top 15 schools. To document, the average Kenan-Flagler base salary for the Class of 2006 was $89,494. The average signing bonus was $22,971. The number of students employed at 90 days post-graduation was 91.5%. "
Eventually, Fortune removed the rankings from their website.
Quacquarelli concludes his article with a warning:
"Users need to delve into each ranking and identify the elements that can provide useful information or insight into schools that may interest them."
Indeed they do.
Monday, August 27, 2007
A settlement has been reached with regard to the suit brought by test takers who received incorrect scores for the October 2005 SAT. The College Board, which owns the test, and NCS Pearson, which scores it, have agreed to pay 2.85 million dollars to about 4,400 people.
"A tentative settlement was announced Friday by the two testing entities and lawyers who filed the class action. About 4,400 people — or about 1 percent of those who took the test that month — are in the class because their scores were reported incorrectly. Under the planned settlement, they will have two options. They can fill out a short form to automatically receive $275, or they can provide more information — if they believe that their damages were greater — and a retired judge will make binding decisions on how much they are entitled to receive."
The full story is here.
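A quick back-of-envelope check of the figures quoted above (how the remainder of the fund would actually be allocated is not stated in the report; the leftover is simply what the flat-payment scenario implies):

```python
# Settlement figures as reported: a $2.85m fund, about 4,400 class
# members, and a $275 flat-payment option.
fund = 2_850_000
class_members = 4_400
flat_payment = 275

# If every class member took the flat payment, the total payout would
# be well under half the fund.
flat_total = class_members * flat_payment
print(flat_total)         # 1,210,000
print(fund - flat_total)  # 1,640,000 left over in that scenario
```

So most of the fund is presumably reserved for those who claim greater damages before the retired judge, and for costs.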
This is certainly embarrassing for the testers and will no doubt be used as ammunition by opponents of standardised testing and its use in university admissions and assessment. But one wonders how many more people have suffered from the gender, race and class bias of interviews. And are we ever going to see THES or QS acknowledge or apologise for any of their errors?
Saturday, August 25, 2007
I have been a fan of Laurie Taylor's satirical column at the Times Higher Education Supplement (subscription required) for a long time. The current issue has an amusing piece about Poppleton University's criticism of university rankings:
"Here at Poppleton, we strongly support this move. For although we were obviously gratified by our appearance at No 2 in the recently compiled Poppleton Evening News league table, our relatively lower positioning in the tables compiled by other newspapers is, we believe, the result of just such bias. Not one of these tables, for example, includes any of the following distinctive Poppleton features:
Size of Human Resources Department
Statistics show that Poppleton has more people involved in managing other people than any other university of comparable size in this country. "
Universities and colleges that complain about the failure of rankings to acknowledge their unique qualities, which somehow are not only unquantifiable but also inexpressible, deserve to be mocked. But they are a rather easy target. Will Laurie Taylor ever have a go at the THES-QS rankings?
Tuesday, August 21, 2007
A number of American liberal arts colleges have refused to contribute to the reputational survey of the US News and World Report rankings. See insidehighered for the full story. This is a highly positive development since such surveys tend to be biased, self-confirming and opaque. The THES-QS "peer review" is perhaps the worst of the lot in these respects but other reputational surveys are probably little better.
Less positive is the news about Sarah Lawrence, a New York liberal arts college. This school no longer looks at the SAT scores of its applicants and therefore has been placed in the "unranked" category by USNWR, which counts SAT scores as a key indicator of student quality. There has been a fair bit of controversy about this but I doubt that Sarah Lawrence will suffer very much. The publicity will probably compensate for losing its place among the top liberal arts colleges.
Sarah Lawrence's action is, however, potentially very dangerous. The SAT is essentially an intelligence test and therefore is highly predictive of academic success and resistant to coaching. There is, it is true, a small scale industry devoted to boosting SAT scores but its claims are grossly exaggerated.
What will likely happen is that admission to Sarah Lawrence will be based on the evaluation of high school essays and performance in class including advanced placement courses and recommendations from teachers and counselors topped up with an array of interesting extra-curricular activities. It is more than likely that the admissions process will give an advantage to those whose parents can move to suburbs with good schools, provide a glut of stimulating activities that will be raw material for essays and provide advice, assistance, Internet access and transport for high school projects.
In short, ultimately admission to Sarah Lawrence -- and no doubt many other colleges eventually -- will be based on the ability to impress high school teachers and administrators and to have an interesting out of school life. In the end this is all far more dependent on parental wealth than an intelligence test.
If Sarah Lawrence's stand becomes widespread -- and it probably will -- then admission to many highly valued American colleges will be determined not by cognitive ability but by the social and communicative skills that can only be acquired by long and expensive socialisation.
What is baffling about this is that, as with the abolition of the 11-plus in Britain, such a development is led by intelligent and educated people who surely must have gained enormously from the spread of standardised testing over the last century.
According to the Princeton Review, they are:
1. Whitman College (Walla Walla, Washington)
2. Brown
3. Clemson University (South Carolina)
4. Princeton
5. Stanford
6. Tulsa
7. College of New Jersey
8. Bowdoin (Maine)
9. Yale
10. Thomas Aquinas College (California)
I am sure that PR's methods could be argued about but it is striking that four of the universities on this list are also in the top ten of selective universities.
The Princeton Review has come out with a variety of rankings. One of them ranks US colleges by how hard they are to get into, which many would think is a good proxy for general quality. Here are the top ten.
1. Harvard
2. Princeton
3. MIT
4. Yale
5. Stanford
6. Brown
7. Columbia
8. Pennsylvania
9. Washington University in St Louis
10. Caltech
Monday, August 20, 2007
The US News and World Report rankings of American colleges is out. A full report is here.
Briefly this is how they are produced:
"To rank colleges and universities, U.S. News first assigns schools to a group of their peers, based on the basic categories developed by the Carnegie Foundation for the Advancement of Teaching in 2006. Those in the National Universities group are the 262 American universities (164 public and 98 private) that offer a wide range of undergraduate majors as well as master's and doctoral degrees; many strongly emphasize research.
In each category, data on up to 15 indicators of academic quality are gathered from each school and tabulated. Schools are ranked within categories by their total weighted score."
The top ten are:
1. Princeton
2. Harvard
3. Yale
4. Stanford
5. Pennsylvania
6. Caltech
7. MIT
8. Duke
9. Columbia
9. Chicago
Sunday, August 19, 2007
The full rankings, released earlier this year, are here.
Below are the top twenty.
1. Harvard
2. Berkeley
3. Princeton
4. Cambridge
5. Caltech
6. MIT
7. Stanford
8. Tokyo
9. UCLA
10. Oxford
11. Cornell
12. Columbia
13. Chicago
14. Colorado -- Boulder
15. ETH Zurich
16. Kyoto
17. Wisconsin -- Madison
18. UC Santa Barbara
19. UC San Diego
20. Illinois -- Urbana-Champaign
Friday, August 17, 2007
Shanghai Jiao Tong University has also released, earlier this year, research rankings in broad subject areas. First, here are the top twenty for the social sciences. The full ranking is here. These rankings should be taken with a fair amount of salt since they are heavily biased towards economics and business studies. Credit is given for Nobel prizes, although in the social sciences these are awarded only for economics; psychology and psychiatry are excluded. There are two categories of highly cited researchers, Social Sciences -- general and Economics/Business.
It would, I think, be better to refer to this as an economics and business ranking.
1. Harvard
2. Chicago
3. Stanford
4. Columbia
5. Berkeley
6. MIT
7. Princeton
8. Pennsylvania
9. Yale
10.Michigan -- Ann Arbor
11. New York Univ
12. Minnesota -- Twin Cities
13. Carnegie-Mellon
14. UCLA
15. Northwestern
16. Cambridge
17. Duke
18. Maryland -- College Park
19. Texas -- Austin
19. Wisconsin -- Madison
Note that there is only one non-US university in the top twenty, Cambridge at number 16. The best Asian university is the Hebrew University in Jerusalem at 40. The best Australian university is ANU at 77-104. There is no mainland Chinese university in the top 100. This is dramatically different from the picture shown by the THES peer review in 2006.
Thursday, August 16, 2007
The London Times (not the Times Higher Education Supplement) has just produced its Good University Guide for British Universities. It is based on eight criteria: student satisfaction, research quality, student staff ratio, services and facilities spend, entry standards, completion, good honours and graduate prospects.
Here are the top ten.
1. Oxford
2. Cambridge
3. Imperial College London
4. London School of Economics
5. St Andrews
6. University College London
7. Warwick
8. Bristol
9. Durham
10. King's College London
I shall try to comment later, but for the moment it's worth pointing out that there are some spectacular rises, notably King's College, Exeter and City University. This immediately raises questions about the stability of the methods and the validity of the data.
Wednesday, August 15, 2007
Shanghai Jiao Tong University has just released its 2007 Academic Ranking of World Universities. The top 100 can be found here. The top 500 are here.
I shall add a few comments in a day or so. Meanwhile here are the top 20.
1 Harvard Univ
2 Stanford Univ
3 Univ California - Berkeley
4 Univ Cambridge
5 Massachusetts Inst Tech (MIT)
6 California Inst Tech
7 Columbia Univ
8 Princeton Univ
9 Univ Chicago
10 Univ Oxford
11 Yale Univ
12 Cornell Univ
13 Univ California - Los Angeles
14 Univ California - San Diego
15 Univ Pennsylvania
16 Univ Washington - Seattle
17 Univ Wisconsin - Madison
18 Univ California - San Francisco
19 Johns Hopkins Univ
20 Tokyo Univ
Tuesday, August 14, 2007
There is a web site, College Ranking Service, that is produced by "a non-profit organization dedicated to providing rankings of colleges in a manner suitable for students, university leaders, and tuition paying parents."
The home page says:
"We take our rankings seriously. Each college is painstakingly analyzed, as if under a microscope, for its flaws and degree of polish. The rankings found on www.rankyourcollege.com represent thousands of hours of research, and are updated annually or at the discretion of the Director.
The Board of the College Ranking Service, composed of Nobel Prize Winners and Captains of Industry, remains anonymous to ensure the integrity of the rankings.
The Director is also anonymous, however, rest assured that he is a prominent member of the academy and a professor of the highest regard at one of the most prestigious universities in the world."
There is also a disclaimer: "There is no such thing as the "College Ranking Service." But the hyperbole and baloney contained in this web site are not that different from equally silly, but maddeningly serious college ranking publications and web sites offered by the media.
It is a sham and a scam to try to rank the quality of universities like sports franchises. Media publications that do this should be laughed out of existence. They simply measure wealth ("The Classic Method" on this web site), which is something that is at best obtusely related to quality.
Regardless of their lack of validity, media-based college rankings are having a negative influence on higher education. Tuition paying parents and their children are swayed by the false prestige these rankings imply. The push to get into a "top ten" school has created added pressure on students to stuff their high school years with lofty sounding, but often meaningless accomplishments. It has been partly responsible for the rise of a college application industry that provides services (like SAT prep classes and college application consulting) of dubious worth."
CRS also describes its methodology:
"In the course of developing our methodology, we found that our rankings had unique properties. First, we noted a phenomenon well known in particle physics, but unheard of heretofore in ranking systems: a college, like a subatomic particle, could be two or more places at once. In other words, individual colleges could have multiple rankings!
Second, we noted the well known and by now passe Heisenberg phenomenon in our rankings: our rankings were influenced by our evaluation. The more we looked at them in great detail, the more variability we saw. Finally, we found a butterfly effect: small perturbations in our extensive data base resulted in significant changes in our rankings.
The combined influences of these phenomena we term the Kanoeddel effect, in honor of the Director's mother's Passover matzah balls, which even though they were made at the same time, had a wide range in density (from that of cotton balls to that of granite pebbles). In Yiddish, the word for "matzah ball" is "kanoeddel."
Because of the Kanoeddel effect, we note that our rankings are not static. Hitting the refresh button on your web browser will cause the Mighty Max to recompute the rankings, resulting in a slightly different order."
In the Guide to the World's Top Universities, we find a perfect example of the first property, with the Technical University of Munich occupying two different places in the rankings and also, in one case, being located in Dortmund. The butterfly effect is illustrated perfectly by the data entry or transfer error that led to an incorrect figure for the student faculty ratio of every university in the Guide.
Sunday, August 12, 2007
This blog was originally supposed to be about university ranking in general but is in danger of turning into a catalogue of THES and QS errors. I shall try to move on to more varied topics in the future, but here is an elaboration of an earlier post on the faculty student ratios in the book Guide to the World's Top Universities, published by QS Quacquarelli Symonds Ltd. (QS) and written by Nunzio Quacquarelli, a director of that company, and John O'Leary and Martin Ince, former and current THES editors.
In the first column below I have arranged the universities in the THES-QS rankings in alphabetical order. The middle column consists of the student faculty ratio included in the 2006 World University Rankings published in the THES (top 200) and on the topuniversities website. The figure is derived from converting the scores out of 100 in the rankings to ratios and cross-checking with QS's figures for faculty and students at topuniversities. The right-hand column contains the student faculty ratio in the Guide's directory and in the profile of the top 100 universities.
For each university the two figures are completely different. But go down three rows and you will find a ranking figure identical, or almost identical, to the Guide figure. Thus Aachen has a ratio of 14.7 in the Guide; go down three rows and you will find that in the rankings and at topuniversities Aberystwyth has a ratio of 14.7.
So, presumably what happened is that someone was pasting data between files and slipped three rows. This simple mistake has resulted in over 500 errors in the Guide.
University | ranking | Guide |
Aachen | 12.4 | 14.7 |
Aarhus | 10.5 | 24.1 |
Aberdeen | 10.5 | 14.8 |
Aberystwyth | 14.7 | 15.1 |
Adelaide | 24.9 | 18.1 |
Adolfo Ibanez | 14.8 | 20.5 |
Airlangga | 15.1 | 12.3 |
Alabama | 18.1 | 10.4 |
Alberta | 20.5 | 13.1 |
Amsterdam | 12.4 | 16.1 |
Antwerp | 10.4 | 24.3 |
Aoyama Gakuin | 13.1 | 13.6 |
Aristotelian | 16.1 | 15.4 |
Arizona State | 24.3 | 14.3 |
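The three-row slip can be checked programmatically: compare each Guide figure with the ranking figure three rows further down. A minimal sketch using the fourteen rows tabulated above (since the ranking figures are derived by converting scores, a small tolerance is allowed for rounding):

```python
# Each tuple is (university, ratio in the rankings, ratio in the Guide),
# as in the table above. The hypothesis: Guide figure for row i equals
# the ranking figure for row i + 3, i.e. the data was pasted three rows off.
rows = [
    ("Aachen", 12.4, 14.7),
    ("Aarhus", 10.5, 24.1),
    ("Aberdeen", 10.5, 14.8),
    ("Aberystwyth", 14.7, 15.1),
    ("Adelaide", 24.9, 18.1),
    ("Adolfo Ibanez", 14.8, 20.5),
    ("Airlangga", 15.1, 12.3),
    ("Alabama", 18.1, 10.4),
    ("Alberta", 20.5, 13.1),
    ("Amsterdam", 12.4, 16.1),
    ("Antwerp", 10.4, 24.3),
    ("Aoyama Gakuin", 13.1, 13.6),
    ("Aristotelian", 16.1, 15.4),
    ("Arizona State", 24.3, 14.3),
]
# Count how many Guide figures match the ranking figure three rows down,
# allowing a 0.2 tolerance for the rounding introduced by score conversion.
matches = sum(
    1 for i in range(len(rows) - 3)
    if abs(rows[i][2] - rows[i + 3][1]) < 0.2
)
print(f"{matches} of {len(rows) - 3} Guide figures match the ranking three rows down")
```

On these fourteen rows, ten of the eleven comparable pairs match within the tolerance, which is far too consistent to be coincidence and supports the paste-slip explanation.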
I have written to Ben Sowter, director of research at QS, and to the authors at Martin Ince's address. So far the only response is an automated message indicating that the latter is away.