Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, August 30, 2007
Nunzio Quacquarelli, director of QS Quacquarelli Symonds (QS), has some advice for applicants to business school. He points out that rankings of business schools are controversial and that many believe they are badly flawed:
"For the last decade, the management education sector has been obsessed with the ranking of business schools. Publishers such as the Financial Times, Business Week and the Wall Street Journal sponsor regular surveys that stoke interest to the point that their coverage produces some of their top selling editions. There is now a growing controversy about whether these rankings provide useful information for MBA applicants, or are misleading and creating a 'herd instinct' towards a few schools, which is of no benefit to anyone.
Business school officials differ in their views of rankings. Although The Wharton School frequently tops rankings, Dean Pat Harker feels, "There is a very strong consensus among parties (alumni, faculty and staff of other institutions), that the ranking methodologies are severely flawed... Some people believe that if the rankings help us, who cares if they are flawed or give a limited view of the school? But we can't have it both ways. We either endorse a defective, inconsistent practice, or we speak out, offer better alternatives for information, and work with the media to enable them to report with more useful, objective data." "
After reviewing various rankings, Quacquarelli looks at QS's own MBA scorecard which allows readers to create their own rankings by changing the weighting of the various components. Unsurprisingly, he thinks highly of the scorecard:
"Rachel Tufft, Marketing Director at Manchester Business School, feels, "Scorecard is the most in-depth and interactive information tool available for MBA applicants today." The MBA Director at Cranfield adds, "TopMBA.com Recruiter Research adds a great deal of value because it is a clear statement from the marketplace about the popularity of international MBA programmes with recruiters. It also gives us a clear indication of what we need to do to improve. Any improvements we make to enhance our visibility amongst recruiters will be of direct benefit to our students - another sign of useful research." "
Quacquarelli does not mention that earlier this year QS provided the data for Fortune's 2007 "50 Best B-Schools for Getting Hired." The University of North Carolina at Chapel Hill's Kenan-Flagler Business School was outraged at not being included and was shocked by "the shoddy, inaccurate and inappropriate research methods employed in the Ranking of Top 50 Business Schools."
Kenan-Flagler has listed QS's errors in detail:
" QS has admitted that they did not contact us for this ranking. They admitted that they used data, often out-of-date information, collected for another purpose. They explained our exclusion by saying that they confused our business school with another North Carolina school (NC State).
Every major ranking organization notifies schools of impending rankings and requests data as input. QS did not. Virtually all data-collection organizations have verification and validation procedures. QS did not. Every publication announcing rankings would at least cross check that major schools from existing, established rankings were included. QS did not.
We and other schools have already uncovered multiple serious issues in data collection and analysis. The salary figures for our and other MBA Programs are outdated or wrong. Some data come from 2004, some from 2005, and some schools have reported the numbers don’t match their data for any year, even though QS contends that the data are all from 2006. If we were to use accurate Kenan-Flagler salary data alone, we would expect to be in the top 15 schools. To document, the average Kenan-Flagler base salary for the Class of 2006 was $89,494. The average signing bonus was $22,971. The number of students employed at 90 days post-graduation was 91.5%. "
Eventually, Fortune removed the rankings from their website.
Quacquarelli concludes his article with a warning:
"Users need to delve into each ranking and identify the elements that can provide useful information or insight into schools that may interest them."
Indeed they do.
Monday, August 27, 2007
A settlement has been reached in the suit brought by test takers who received incorrect scores on the October 2005 SAT. The College Board, which owns the test, and NCS Pearson, which scores it, have agreed to pay $2.85 million to about 4,400 people.
"A tentative settlement was announced Friday by the two testing entities and lawyers who filed the class action. About 4,400 people — or about 1 percent of those who took the test that month — are in the class because their scores were reported incorrectly. Under the planned settlement, they will have two options. They can fill out a short form to automatically receive $275, or they can provide more information — if they believe that their damages were greater — and a retired judge will make binding decisions on how much they are entitled to receive."
The full story is here.
This is certainly embarrassing for the testers and will no doubt be used as ammunition by opponents of standardised testing and its use in university admissions and assessment. But one wonders how many more people have suffered from the gender, race and class bias of interviews. And are we ever going to see THES or QS acknowledge or apologise for any of their errors?
Saturday, August 25, 2007
I have been a fan of Laurie Taylor's satirical column at the Times Higher Education Supplement (subscription required) for a long time. The current issue has an amusing piece about Poppleton University's criticism of university rankings:
"Here at Poppleton, we strongly support this move. For although we were obviously gratified by our appearance at No 2 in the recently compiled Poppleton Evening News league table, our relatively lower positioning in the tables compiled by other newspapers is, we believe, the result of just such bias. Not one of these tables, for example, includes any of the following distinctive Poppleton features:
Size of Human Resources Department
Statistics show that Poppleton has more people involved in managing other people than any other university of comparable size in this country. "
Universities and colleges that complain about the failure of rankings to acknowledge their unique qualities, which somehow are not only unquantifiable but also inexpressible, deserve to be mocked. But they are a rather easy target. Will Laurie Taylor ever have a go at the THES-QS rankings?
Tuesday, August 21, 2007
A number of American liberal arts colleges have refused to contribute to the reputational survey of the US News and World Report rankings. See insidehighered for the full story. This is a highly positive development since such surveys tend to be biased, self-confirming and opaque. The THES-QS "peer review" is perhaps the worst of the lot in these respects but other reputational surveys are probably little better.
Less positive is the news about Sarah Lawrence, a New York liberal arts college. This school no longer looks at the SAT scores of its applicants and therefore has been placed in the "unranked" category by USNWR, which counts SAT scores as a key indicator of student quality. There has been a fair bit of controversy about this but I doubt that Sarah Lawrence will suffer very much. The publicity will probably compensate for losing its place among the top liberal arts colleges.
Sarah Lawrence's action is, however, potentially very dangerous. The SAT is essentially an intelligence test and therefore is highly predictive of academic success and resistant to coaching. There is, it is true, a small scale industry devoted to boosting SAT scores but its claims are grossly exaggerated.
What will likely happen is that admission to Sarah Lawrence will be based on the evaluation of high school essays and performance in class including advanced placement courses and recommendations from teachers and counselors topped up with an array of interesting extra-curricular activities. It is more than likely that the admissions process will give an advantage to those whose parents can move to suburbs with good schools, provide a glut of stimulating activities that will be raw material for essays and provide advice, assistance, Internet access and transport for high school projects.
In short, ultimately admission to Sarah Lawrence -- and no doubt many other colleges eventually -- will be based on the ability to impress high school teachers and administrators and to have an interesting out of school life. In the end this is all far more dependent on parental wealth than an intelligence test.
If Sarah Lawrence's stand became widespread -- and it probably will -- then admission to many highly valued American colleges will be determined not by cognitive ability but by the social and communicative skills that can only be acquired by long and expensive socialisation.
What is baffling about this is that, as with the abolition of the 11-plus in Britain, such a development is led by intelligent and educated people who surely must have gained enormously from the spread of standardised testing over the last century.
According to the Princeton Review they are:
1. Whitman College (Walla Walla, Washington)
2. Brown
3. Clemson University (South Carolina)
4. Princeton
5. Stanford
6. Tulsa
7. College of New Jersey
8. Bowdoin (Maine)
9. Yale
10. Thomas Aquinas College (California)
I am sure that PR's methods could be argued about but it is striking that four of the universities on this list are also in the top ten of selective universities.
The Princeton Review has come out with a variety of rankings. One of them ranks US colleges by how hard they are to get into, which many would think is a good proxy for general quality. Here are the top ten.
1. Harvard
2. Princeton
3. MIT
4. Yale
5. Stanford
6. Brown
7. Columbia
8. Pennsylvania
9. Washington in St Louis
10. Caltech
Monday, August 20, 2007
The US News and World Report rankings of American colleges are out. A full report is here.
Briefly this is how they are produced:
"To rank colleges and universities, U.S. News first assigns schools to a group of their peers, based on the basic categories developed by the Carnegie Foundation for the Advancement of Teaching in 2006. Those in the National Universities group are the 262 American universities (164 public and 98 private) that offer a wide range of undergraduate majors as well as master's and doctoral degrees; many strongly emphasize research.
In each category, data on up to 15 indicators of academic quality are gathered from each school and tabulated. Schools are ranked within categories by their total weighted score."
The top ten are:
1. Princeton
2. Harvard
3. Yale
4. Stanford
5. Pennsylvania
6. Caltech
7. MIT
8. Duke
9. Columbia
9. Chicago
Sunday, August 19, 2007
The full rankings, released earlier this year, are here.
Below are the top twenty.
1. Harvard
2. Berkeley
3. Princeton
4. Cambridge
5. Caltech
6. MIT
7. Stanford
8. Tokyo
9. UCLA
10. Oxford
11. Cornell
12. Columbia
13. Chicago
14. Colorado -- Boulder
15. ETH Zurich
16. Kyoto
17. Wisconsin -- Madison
18. UC Santa Barbara
19. UC San Diego
20. Illinois -- Urbana-Champaign
Friday, August 17, 2007
Shanghai Jiao Tong University has also released, earlier this year, research rankings in broad subject areas. First, here are the top twenty for the social sciences. The full ranking is here. These rankings should be taken with a fair bit of salt since they are heavily biased towards economics and business studies. Credit is given for Nobel prizes, although in the social sciences these are awarded only for economics, while psychology and psychiatry are excluded. There are two categories of highly cited researchers, Social Sciences -- General and Economics/Business.
It would, I think, be better to refer to this as an economics and business ranking.
1. Harvard
2. Chicago
3. Stanford
4. Columbia
5. Berkeley
6. MIT
7. Princeton
8. Pennsylvania
9. Yale
10. Michigan -- Ann Arbor
11. New York Univ
12. Minnesota -- Twin Cities
13. Carnegie-Mellon
14. UCLA
15. Northwestern
16. Cambridge
17. Duke
18. Maryland -- College Park
19. Texas -- Austin
19. Wisconsin -- Madison
Note that there is only one non-US university in the top twenty, Cambridge at number 16. The best Asian university is the Hebrew University of Jerusalem at 40. The best Australian university is ANU at 77-104. There is no mainland Chinese university in the top 100. This is dramatically different from the picture shown by the THES peer review in 2006.
Thursday, August 16, 2007
The London Times (not the Times Higher Education Supplement) has just produced its Good University Guide for British Universities. It is based on eight criteria: student satisfaction, research quality, student staff ratio, services and facilities spend, entry standards, completion, good honours and graduate prospects.
Here are the top ten.
1. Oxford
2. Cambridge
3. Imperial College London
4. London School of Economics
5. St Andrews
6. University College London
7. Warwick
8. Bristol
9. Durham
10. King's College London
I shall try to comment later, but for the moment it is worth pointing out that there are some spectacular rises for King's College, Exeter and City University. This immediately raises questions about the stability of the methods and the validity of the data.
Wednesday, August 15, 2007
Shanghai Jiaotong University has just released its 2007 Academic Ranking of World Universities. The top 100 can be found here. The top 500 are here.
I shall add a few comments in a day or so. Meanwhile here are the top 20.
1 Harvard Univ
2 Stanford Univ
3 Univ California - Berkeley
4 Univ Cambridge
5 Massachusetts Inst Tech (MIT)
6 California Inst Tech
7 Columbia Univ
8 Princeton Univ
9 Univ Chicago
10 Univ Oxford
11 Yale Univ
12 Cornell Univ
13 Univ California - Los Angeles
14 Univ California - San Diego
15 Univ Pennsylvania
16 Univ Washington - Seattle
17 Univ Wisconsin - Madison
18 Univ California - San Francisco
19 Johns Hopkins Univ
20 Tokyo Univ
Tuesday, August 14, 2007
There is a web site, College Ranking Service, that is produced by "a non-profit organization dedicated to providing rankings of colleges in a manner suitable for students, university leaders, and tuition paying parents."
The home page says:
"We take our rankings seriously. Each college is painstakingly analyzed, as if under a microscope, for its flaws and degree of polish. The rankings found on www.rankyourcollege.com represent thousands of hours of research, and are updated annually or at the discretion of the Director.
The Board of the College Ranking Service, composed of Nobel Prize Winners and Captains of Industry, remains anonymous to ensure the integrity of the rankings.
The Director is also anonymous, however, rest assured that he is a prominent member of the academy and a professor of the highest regard at one of the most prestigious universities in the world."
There is also a disclaimer: "There is no such thing as the "College Ranking Service." But the hyperbole and baloney contained in this web site are not that different from equally silly, but maddeningly serious college ranking publications and web sites offered by the media.
It is a sham and a scam to try to rank the quality of universities like sports franchises. Media publications that do this should be laughed out of existence. They simply measure wealth ("The Classic Method" on this web site), which is something that is at best obtusely related to quality.
Regardless of their lack of validity, media-based college rankings are having a negative influence on higher education. Tuition paying parents and their children are swayed by the false prestige these rankings imply. The push to get into a "top ten" school has created added pressure on students to stuff their high school years with lofty sounding, but often meaningless accomplishments. It has been partly responsible for the rise of a college application industry that provides services (like SAT prep classes and college application consulting) of dubious worth."
CRS also describes its methodology:
"In the course of developing our methodology, we found that our rankings had unique properties. First, we noted a phenomenon well known in particle physics, but unheard of heretofore in ranking systems: a college, like a subatomic particle, could be two or more places at once. In other words, individual colleges could have multiple rankings!
Second, we noted the well known and by now passe Heisenberg phenomenon in our rankings: our rankings were influenced by our evaluation. The more we looked at them in great detail, the more variability we saw. Finally, we found a butterfly effect: small perturbations in our extensive data base resulted in significant changes in our rankings.
The combined influences of these phenomena we term the Kanoeddel effect, in honor of the Director's mother's Passover matzah balls, which even though they were made at the same time, had a wide range in density (from that of cotton balls to that of granite pebbles). In Yiddish, the word for "matzah ball" is "kanoeddel."
Because of the Kanoeddel effect, we note that our rankings are not static. Hitting the refresh button on your web browser will cause the Mighty Max to recompute the rankings, resulting in a slightly different order."
In the Guide to the World's Top Universities, we find a perfect example of the first property, with the Technical University of Munich occupying two different places in the rankings and also, in one case, being located in Dortmund. The butterfly effect is illustrated perfectly by the data entry or transfer error that led to an incorrect figure for student faculty ratio for every university in the Guide.
Sunday, August 12, 2007
This blog was originally supposed to be about university ranking in general but is in danger of turning into a catalogue of THES and QS errors. I shall try to move on to more varied topics in the future, but here is an elaboration of an earlier post on the faculty student ratios in the book Guide to the World's Top Universities, published by QS Quacquarelli Symonds Ltd. (QS) and written by Nunzio Quacquarelli, a director of that company, and John O'Leary and Martin Ince, former and current THES editors.
In the first column below I have arranged the universities in the THES-QS rankings in alphabetical order. The middle column consists of the student faculty ratio included in the 2006 World University Rankings published in the THES (top 200) and on the topuniversities website. The figure is derived from converting the scores out of 100 in the rankings to ratios and cross-checking with QS's figures for faculty and students at topuniversities. The right-hand column contains the student faculty ratio in the Guide's directory and in the profile of the top 100 universities.
The two figures are completely different. But if you go down three rows you will find the figure is identical or almost identical.
Thus Aachen has a ratio of 14.7 in the Guide. Go down three rows and you will find that in the rankings and at topuniversities Aberystwyth has a ratio of 14.7.
So, presumably what happened is that someone was pasting data between files and slipped three rows. This simple mistake has resulted in over 500 errors in the Guide.
University | ranking | Guide |
Aachen | 12.4 | 14.7 |
Aarhus | 10.5 | 24.1 |
Aberdeen | 10.5 | 14.8 |
Aberystwyth | 14.7 | 15.1 |
Adelaide | 24.9 | 18.1 |
Adolfo Ibanez | 14.8 | 20.5 |
Airlangga | 15.1 | 12.3 |
Alabama | 18.1 | 10.4 |
Alberta | 20.5 | 13.1 |
Amsterdam | 12.4 | 16.1 |
Antwerp | 10.4 | 24.3 |
Aoyama Guiken | 13.1 | 13.6 |
Aristotelian | 16.1 | 15.4 |
Arizona State | 24.3 | 14.3 |
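The off-by-three pattern in the table above can be checked mechanically. Here is a minimal sketch in Python, using only the fourteen rows quoted in the table: for each candidate shift it counts how many Guide figures match the rankings figure that many rows further down, and the three-row shift wins decisively.

```python
# Sketch only (not QS's data): the two columns of student-faculty
# ratios from the table above, in alphabetical order of university.
ranking = [12.4, 10.5, 10.5, 14.7, 24.9, 14.8, 15.1,
           18.1, 20.5, 12.4, 10.4, 13.1, 16.1, 24.3]
guide = [14.7, 24.1, 14.8, 15.1, 18.1, 20.5, 12.3,
         10.4, 13.1, 16.1, 24.3, 13.6, 15.4, 14.3]

def matches_at_shift(ranking, guide, k):
    """Count rows where the Guide figure equals the rankings figure
    k rows further down (to within rounding)."""
    return sum(1 for g, r in zip(guide, ranking[k:]) if abs(g - r) < 0.05)

# Try shifts of 0 to 5 rows; a paste error shows up as a clear winner.
best = max(range(6), key=lambda k: matches_at_shift(ranking, guide, k))
print(best)  # → 3
```

On these fourteen rows, a shift of three matches nine times while every other shift matches none, which is what a slipped copy-and-paste between two alphabetically sorted files would produce.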
I have written to Ben Sowter, director of research at QS, and to the authors at Martin Ince's address. So far the only response is an automated message indicating that the latter is away.
Friday, August 10, 2007
Another site worth reading on law school rankings is MoneyLaw. There is an interesting post on the US News and World Report's correction policy. It seems that if USNWR makes a mistake, the rankings are corrected, but if the school is responsible, the underlying data, but not the rankings, are corrected.
MoneyLaw comments:
"I wouldn't call that grossly unfair. Academic research would perhaps demand more attention to setting the record straight, granted. But USN&WR's rankings hardly constitute academic research."
I would add that this policy seems dramatically better than that of most other rankings.
State universities and colleges have come up with a plan to publish essential information on their web sites.
The National Association of State Universities and Land-Grant Colleges and the American Association of State Colleges and Universities are, according to the Wall Street Journal,
" designing a template for college Web sites that, for those that opt to use it, shows in standard format: (1) details about admission rates, costs and graduation rates to make comparisons simple; (2) results from surveys of students designed to measure satisfaction and engagement, and (3) results of tests given to a representative sample of students to gauge not how smart they were when they arrived, but how much they learned about writing, analysis and problem-solving between freshman and senior years.
The last one is the biggie. Participating schools will use one of three tests to gauge the performance of students with similar entering SAT scores at tasks that any college grad ought to be able to handle. One test, the Collegiate Learning Assessment, gives students some circumstance and a variety of information about it, and asks for short essays (no multiple choice) on solving a problem or analyzing a scenario. Under the state schools' proposed grading scale, 70% of the schools will report that students did "as expected," given their SATs. An additional 15% will report they did better or much better than expected, and 15% will report students did worse or much worse than expected."
This seems like a good idea. It could even go some way towards making commercial rankings redundant.
Saturday, August 04, 2007
I have just noticed, via Wikipedia, a world university ranking by the Research Centre for China Science Evaluation at Wuhan University that seems to be based on current research productivity. Since the website does not have an English version, it is not possible to comment very much about it at the moment. According to Wikipedia it " is based on Essential Science Indicators (ESI), which provides data of journal article publication counts and citation frequencies in over 11,000 journals around the world in 22 research fields". If anyone can look at the website and tell me what the research fields are and what period is covered I'd be grateful.
I noticed some errors in the rankings. Laval University is listed as being in France rather than Canada, York in the US rather than Canada, and Bern in Sweden rather than Switzerland. Ljubljana is listed as being in "Jugoslavia", a few years out of date.
If the rankers have assessed a broad range of subjects and if they have looked at a recent period and if their methods are valid they may have produced a ranking of research achievement that is more current than the Shanghai index which includes decades-old Nobel and Fields prize winners. The ranking gives low positions to Cambridge and Oxford confirming suspicions that their high rating by THES -QS is unjustified. Princeton and Yale (strengths in the humanities?) have relatively low places. So do Chicago (strength in the social sciences?) and Caltech.
There are some more surprises. Texas is at number 2. Maybe this represents a genuine advance or perhaps the presence of a large medical school has something to do with it. "Univ Washington" is at number 3. This most probably means Washington University in St Louis. Before getting too excited about this result I would like to be sure that there has been no confusion with The University of Washington, Washington State University and George Washington University.
Here is the top 20 along with the scores. Harvard at the top gets 100. The full ranking can be found here.
1. Harvard 100
2. Texas 87.49
3. "Univ Washington" 72.39
4. Stanford 71.91
5. Johns Hopkins 71.45
6. UC Berkeley 70.76
7. UCLA 70.38
8. Michigan 69.11
9. MIT 68.62
10. Toronto 66.90
11. Wisconsin 64.83
12. Columbia 64.71
13. UC San Diego 64.54
14. Pennsylvania 64.42
15. Cambridge 62.93
16. Minnesota 62.80
17. Yale 62.20
18. Cornell 62.19
19. UC San Francisco 61.52
20. Duke 60.60
Friday, August 03, 2007
QS Quacquarelli Symonds seems to have bad luck with international students in Malaysia. Their topuniversities site has a piece on "Study Abroad in Malaysia" which states
"On the back of its enduring economic and industrial boom, Malaysia is trying hard to position itself as the Asian destination of choice for international students seeking to study abroad, and with some success. Currently there are around 50,000 students from 100 countries in Malaysian tertiary education, forming 20-30% of the student body - and the country wants to promote a multicultural image that reflects the country itself. "
The total number of registered students in tertiary education in Malaysia is in fact about 732,000. International students therefore make up well under ten per cent of tertiary students.
According to the Kuala Lumpur New Straits Times, the vice-chancellor of Universiti Malaya has said that the university's international ranking
"should not be a target. Instead, UM’s main aim was to produce
quality work, she added"
Wednesday, August 01, 2007
Another blog worth looking at is Agoraphilia which, among other things, has posts on the US News and World Report law school rankings. One of them deals with the University of Florida's receiving an erroneous and over-favourable rating from USN&WR, apparently because it reported the LSAT scores and GPAs only for the fall 2005 intake and did not include those for the spring intake.
What most impresses me about this is that the Dean and Associate Dean of Florida's law school and Robert J. Morse, Director of Data Research at USN&WR, have replied promptly and at length to questions about ranking methods.
Friday, July 27, 2007
This is a very good page produced by Boston College with links to sites and articles on university rankings. For a start take a look at 'Playing with Numbers' by Nicholas Thompson.
Wednesday, July 25, 2007
One of the more interesting elements in the Guide to the World's Top Universities by John O'Leary, Nunzio Quacquarelli and Martin Ince, published by QS Quacquarelli Symonds at the end of 2006, is the information about student faculty ratio provided in the directory of over 500 universities and the profiles of the world's top 100 universities.
These are, even at first sight, not plausible: 590.30 students per faculty at Pretoria, 43.30 at Colorado State University, 18.10 at Harvard, 3.50 at Dublin Institute of Technology.
Scepticism increases when the Guide's data for student faculty ratio are correlated with those derived from the scores out of 100 for this measure in the 2006 rankings and cross-checked against the data on individual universities on QS's topuniversities site. The correlation for 517 universities is negligible at .057 and statistically insignificant (2-tailed .195).
Comparing the two sets of data on student faculty ratio for the British universities in the rankings shows that the problem is with the information in the Guide, not that in the rankings. The rankings data correlate highly with those provided by the Higher Education Statistics Agency (HESA: see earlier post) (.712, sig = .000) and those taken from the web site williseemytutor (.812, sig = .000). There is no significant correlation between the data in the Guide and the HESA data (.133, sig = .389) or the data derived from williseemytutor (.179, sig = .250).
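For anyone who wants to reproduce this sort of check without SPSS, the statistic involved, the Pearson correlation coefficient, is easy to compute directly. The sketch below is purely illustrative: the data lists are invented stand-ins, not the actual 517-university figures.

```python
# Hand-rolled Pearson correlation, the statistic quoted above
# (.712, .057, etc.). The lists below are invented for illustration.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rankings_data = [12.4, 10.5, 14.7, 24.9, 14.8, 15.1, 18.1, 20.5]
hesa_like = [12.0, 11.0, 14.5, 25.3, 15.0, 15.4, 17.8, 20.1]  # tracks closely
print(round(pearson_r(rankings_data, hesa_like), 3))  # close to 1
```

Two sources describing the same universities should produce a coefficient near 1, as here; a coefficient near zero, like the .057 reported for the Guide, is what one would expect from two essentially unrelated columns of numbers.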
So, where did the Guide's student faculty data come from?
First, here are the most favourable student faculty ratios calculated from the scores in the rankings (they can be cross-checked at the topuniversities site) and rounded to the first decimal place.
Duke 3.5
Yale 3.7
Eindhoven University of Technology 3.8
Rochester 3.8
London Imperial College 3.9
Paris Sciences Po 4.0
Tsing Hua 4.1
Emory 4.1
Geneva 4.3
Vanderbilt 4.3
Now, here are the most favourable ratios given in the Guide.
Dublin Institute of Technology 3.5
Wollongong 3.7
Ecole Polytechnique 3.8
Rio de Janeiro 3.8
Ljubljana 3.9
Oulu 4.0
Trento 4.1
Edinburgh 4.1
Fudan 4.3
Utrecht 4.3
Notice that the ratio of 3.5 is assigned to Duke University in the rankings and to Dublin IT in the Guide. If the universities are arranged alphabetically, these two would be in adjacent rows. Likewise, the other scores listed above are assigned to universities that would be next to each other, or nearly so, in an alphabetical listing.
Next are the least favourable ratios derived from the rankings data.
Pune 580
Delhi 316
Tor Vergata 53
Bologna 51
Cairo 49
Concordia 42
Now the ratios in the Guide.
Pretoria 590
De La Salle 319
RMIT 53
Bilkent 51
Bucharest 49
Colorado 42
Notice again that, except for Tor Vergata and RMIT, the ratio in the two data sets is shared by universities that are close or next to each other alphabetically.
The conclusion is unavoidable. When the Guide was being prepared, somebody created a new file and, slipping down a row or several, inserted the rankings data in the wrong rows. So every university in the Guide's directory acquired a new and erroneous student faculty ratio.
Since this piece of information is the one most likely to interest future undergraduate students, this is not a trivial error.
Is this error any less serious than QS's getting the two North Carolina business schools mixed up?
Sunday, July 22, 2007
The third Asia Pacific Professional Leaders in Education conference was held in Hong Kong recently. The conference was organised by QS Quacquarelli Symonds (QS), consultants for the THES rankings, and a substantial part of the proceedings seems to have been concerned with international university rankings. There is a report by Karen Chapman in the Kuala Lumpur Star. There are hints that the methods of the THES-QS rankings may be revised and improved this year. The QS head of research, Ben Sowter, has referred to a revision of the questionnaires and to an audit and validation of information. Perhaps the deficiencies of previous rankings will be corrected.
There is also a reference to a presentation by John O'Leary, former editor of the THES, who is reported as saying that
“Peer review is the centrepiece of the rankings as that is the way academic value is measured.”
The second part of this sentence is correct but conventional peer review in scientific and academic research is totally different from the survey that is the centrepiece of the THES rankings.
Peer review means that research is scrutinised by researchers who have been recognised as authorities in a narrowly defined research field. However, inclusion in the THES-QS survey of academic opinion has so far required no more expertise than the ability to sign on to the mailing list of World Scientific, a Singapore-based academic publisher. Those who are surveyed by QS are, in effect, allowed to give their opinions about subjects of which they may know absolutely nothing. Possibly, the reference to redesigning the survey means that it will become more like a genuine peer review.
It cannot be stressed too strongly or repeated too often that, on the basis of the information released so far by QS, the THES-QS survey is not a peer review.
There is an excellent post by Eric Beerkens at Beerkens' Blog reporting on an article by Wendy Nelson Espeland and Michael Sauder in the American Journal of Sociology. The article, 'Rankings and reactivity: How public measures recreate social worlds', describes how the law school rankings of the US News and World Report affect the behaviour of students, university administrators and others.
Beerkens argues that international university rankings also have several consequences:
1. Rankings affect external audiences. Trivial differences between institutions may lead to large differences in the quality and quantity of applicants.
2. Rankings may amplify differences in reputations. If researchers or administrators are asked to assess universities of which they have no knowledge they are likely to rely on the results of previous rankings.
3. Resources such as grants may be distributed on the basis of rankings.
4. Universities will give up objectives that are not measured in rankings and try to become more like those that achieve high scores.
Saturday, July 21, 2007
There is a Spanish-language blog on university rankings and other academic matters by Alejandro Pisanty that is well worth looking at.
Tuesday, July 17, 2007
Matt Rayner has posted an interesting question on the QS topuniversities site. He has noticed that in the Guide to the World's Top Universities, published by QS, Cambridge is supposed to have a student faculty ratio of 18.9 and a score of 64 for this part of the 2006 World Rankings while Glasgow, with an almost identical ratio of 18.8, gets a score of 35.
As already noted, this anomaly is not confined to Cambridge and Glasgow. The student faculty ratios provided in the data about individual universities in the Guide are completely different from those given in the rankings.
There is in fact no significant relationship, as a quick correlation done by SPSS will show, between the two sets of data.
It will be even more interesting to see when and how QS replies to Matt's question.