Wednesday, August 17, 2011

A Rising Crescent?

One advantage of the methodological stability of the Shanghai university rankings is that it is possible to identify long-term changes even though the year-to-year changes may be quite small.

One trend that becomes apparent when comparing the 2003 and 2011 rankings is the increasing number of universities from predominantly Muslim countries.

In 2003 there was exactly one listed, Istanbul University.

This year there were six: King Saud University (Saudi Arabia) in the 201-300 band; King Fahd University of Petroleum and Minerals (Saudi Arabia), Tehran University and Istanbul University in the 301-400 band; and Cairo University and Universiti Malaya in the 401-500 band.

In the next year or two King Abdullah University of Science and Technology, Saudi Arabia, headed by a former president of the National University of Singapore, will probably join the list.

Monday, August 15, 2011

Another Twist in the Plot

The  relationship between Malaysian universities and international rankers would make a good soap opera, full of break-ups, reconciliations and recriminations.

It started in 2004 when the first THES-QS ranking put Universiti Malaya (UM) in the top 100 and Universiti Sains Malaysia in the top 200. There was jubilation at the UM campus, with triumphant banners all over the place. Then it all came crashing down in 2005 when QS revealed that they had made a mistake by counting ethnic minorities as international students and faculty. There followed a "clarification of data" and UM was expelled from the top 100.

The  Malaysian opposition believed, or pretended to believe, that this was evidence of the unrelenting decline of the country's universities. The Vice-Chancellor of UM went off into the academic wilderness but still remained on QS's advisory board.

UM continued to chase the holy grail of a top-200 ranking through the vigorous pursuit of publications and citations. There was discontent among the faculty, voiced in a letter to a local newspaper:

"The writer claimed that many have left UM and “many more are planning to leave, simply because of the expectations from the management”.

“UM is not what it used to be. The academic staff are highly demoralised and unhappy due to the management’s obsession and fixation with ISI publications, while research, consultancy, and contribution to the nation, such as training of PhD students are considered as secondary,” the letter said."

In 2007 the Malaysian government asked universities for plans to be considered for APEX (Accelerated Programme for Excellence) status, which would include a substantial degree of university autonomy. It boiled down to a fight between UM and USM, which was won by USM, apparently because of its inspiring plans.

'"The selection committee evaluated each university's state of readiness, transformation plan and preparedness for change. The university that is granted apex status is theone that has the highest potential among Malaysian universities to be world-class, and as such, would be given additional assistance to compete with top-ranking global institutions,‘addedKhaled. "Apex is about accelerated change. It is not about business as usual –but business unusual""USM has been working on its own transformation plan –We started with the ‘Healthy Campus’concept, before moving on to the‘University in a garden’concept. We subsequently adopted the ‘Research University’concept."Tan Sri DatoProf DzulkifliAbdul RazakVice Chancellor, USMSelection Committee Chairman, Dr. MohamadZawawi, former Vice Chancellor of Universiti Malaysia Sarawak said the committee also paid special attention to the institutions’strategic intent and transformation plans. Visits were made to short-listed institutions where discussions were held with senior staff, academicians, students and staff associations to understand the prevailing campus’‘climate’and factors related to the proposed plans.With apex status, USM will be given the autonomy to have the best in terms of governance, resources and talent and is expected to move up in the World University Rankings with a target of top 200 in five years and in the top 100, if not 50, by 2020.'

Note that USM was expected to use its status to climb the international rankings. However, it is now refusing to have anything to do with the rankings, something that is understandable.

The issue of which university deserves APEX was reopened this morning when it was announced that Universiti Malaya  was in the top 500 of the Shanghai Academic Ranking of World Universities.

This is unlikely to be a mistake like that of 2004. The Shanghai rankers have had methodological problems, like what to do about merging or splitting universities, but they do not change the basic methodology and they do not make serious mistakes. We are not going to hear next year about a "clarification of data".

UM's success is narrowly based. They have no Nobel prize winners, no highly cited researchers, only a handful of papers in Nature and Science, but quite a lot of publications in ISI-indexed journals. One might complain that there is too much emphasis on quantity, but this is nevertheless a tangible achievement.

Sunday, August 14, 2011

Press release from Shanghai

Here is the press release from Shanghai Jiao Tong University giving more details about this year's rankings.

Monday, August 15, 2011
Shanghai, People's Republic of China
The Center for World-Class Universities of Shanghai Jiao Tong University released today the 2011 Academic Ranking of World Universities (ARWU), marking its 9th consecutive year of measuring the performance of top universities worldwide.
Harvard University tops the 2011 list; other Top 10 universities are: Stanford, MIT, Berkeley, Cambridge, Caltech, Princeton, Columbia, Chicago and Oxford. In Continental Europe, ETH Zurich (23rd) in Switzerland takes first place, followed by Paris-Sud (40th) and Pierre and Marie Curie (41st) in France. The best ranked universities in Asia are University of Tokyo (21st) and Kyoto University (24th) in Japan.
Three universities are ranked among Top 100 for the first time in the history of ARWU: University of Geneva (73rd), University of Queensland (88th) and University of Frankfurt (100th). As a result, the number of Top 100 universities in Switzerland, Australia and Germany increases to 4, 4 and 6 respectively.
Ten universities first enter into Top 500, among them University of Malaya in Malaysia and University of Zagreb in Croatia enable their home countries to be represented, together with other 40 countries, in the 2011 ARWU list.
Progress of universities in Middle East countries is remarkable. King Saud University in Saudi Arabia first appears in Top 300; King Fahd University of Petroleum & Minerals in Saudi Arabia, Istanbul University in Turkey and University of Teheran in Iran move up in Top 400 for the first time; Cairo University in Egypt is back to Top 500 after five years of staggering outside.
The number of Chinese universities in Top 500 increases to 35 in 2011, with National Taiwan University, Chinese University of Hong Kong, and Tsinghua University ranked among Top 200.
The Center for World-Class Universities of Shanghai Jiao Tong University also released the 2011 Academic Ranking of World Universities by Broad Subject Fields (ARWU-FIELD) and 2011 Academic Ranking of World Universities by Subject Field (ARWU-SUBJECT). Top 100 universities in five broad subject fields and in five selected subject fields are listed, where the best five universities are:
Natural Sciences and Mathematics – Harvard, Berkeley, Princeton, Caltech and Cambridge
Engineering/Technology and Computer Sciences – MIT, Stanford, Berkeley, UIUC and Georgia Tech
Life and Agriculture Sciences – Harvard, MIT, UC San Francisco, Cambridge and Washington (Seattle)
Clinical Medicine and Pharmacy – Harvard, UC San Francisco, Washington (Seattle), Johns Hopkins and Columbia
Social Sciences – Harvard, Chicago, MIT, Berkeley and Columbia
Mathematics – Princeton, Harvard, Berkeley, Stanford and Cambridge
Physics – MIT, Harvard, Caltech, Princeton and Berkeley
Chemistry – Harvard, Berkeley, Stanford, Cambridge and ETH Zurich
Computer Science – Stanford, MIT, Berkeley, Princeton and Harvard
Economics/Business – Harvard, Chicago, MIT, Berkeley and Columbia
The complete lists and detailed methodologies can be found at the Academic Ranking of World Universities website at http://www.ShanghaiRanking.com/.
Academic Ranking of World Universities (ARWU): Starting from 2003, ARWU has been presenting the world top 500 universities annually based on a set of objective indicators and third-party data. ARWU has been recognized as the precursor of global university rankings and the most trustworthy list. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution. More than 1000 universities are actually ranked by ARWU every year and the best 500 are published.
Center for World-Class Universities of Shanghai Jiao Tong University (CWCU): CWCU has been focusing on the study of world-class universities for many years, published the first Chinese-language book titled world-class universities and co-published the first English book titled world-class universities with European Centre for Higher Education of UNESCO. CWCU initiated the "International Conference on World-Class Universities" in 2005 and organizes the conference every second year, which attracts a large number of participants from all major countries. CWCU endeavors to build databases of major research universities in the world and clearinghouse of literature on world-class universities, and provide consultation for governments and universities.
Contact: Dr. Ying CHENG at ShanghaiRanking@gmail.com

Breaking News

The Shanghai rankings are out. Go here.

One interesting result is that Universiti Malaya is in the top 500 for the first time, mainly because of  a large number of publications.

Wednesday, August 10, 2011

America's Best Colleges

As the world waits anxiously for the publication of Princeton Review's Stone Cold Sober School Rankings (like everybody else I am praying the US Air Force Academy stays in the top ten), there are a few less important rankings like Shanghai's ARWU or the Forbes/CCAP (Center for College Affordability and Productivity) Rankings to study.

The latter, which have just been released, are designed for student consumers. The components are student satisfaction, post-graduate success, student debt, four-year graduation rate and competitive awards. They clearly fulfill a need that other rankings do not. It is possible that some of the indicators could be adopted for an international ranking. The top 10, a mix of Ivy League schools, small liberal arts colleges and service academies, are:

1.  Williams College
2.  Princeton
3.  US Military Academy
4.  Amherst College
5. Stanford
6.  Harvard
7.  Haverford College
8.  Chicago
9.  MIT
10. US Air Force Academy

Tuesday, August 09, 2011

The Missing Indicator

There is an interesting item in Times Higher Education. Apparently leading universities might be offering financial inducements in the form of scholarships to students with AAB at A level. For Americans that would be something like a 3.9 GPA.

Universities like to proclaim that they provide something that enables students to succeed in the post-university world, such as training in critical thinking, soft skills or exposure to a diverse multicultural society, for which massive tuition fees can be extracted from parents or government. Whether they actually do is debatable. A book-length study by Richard Arum and Josipa Roksa, Academically Adrift, indicated that universities do little to teach students how to think.

Employers seem to have a rather different view of the matter. Many restrict themselves to recruiting from only the elite universities and are quite unconcerned with whether universities have established learning outcomes or whether they have created a safe space for diverse students. They simply wish to recruit the most intelligent students that they  can, with perhaps a bit of charm and likability for publicly visible positions.

Left to their own devices, universities would probably do their best to admit the most intelligent students available. The news that some are willing to pay for those with good A levels is a stark reminder that they are not being entirely honest in claiming that they provide an excellent education that is worth paying for. If that were the case, why not just put in a bit more effort and a few thousand pounds to turn a BBB or CCC student into an AAB one? The answer is that while you might get a bit more out of a student by teaching reading and writing skills, basic numeracy and so on, recruiting from the top of the cognitive scale will bring much more to a university.

Calculating the average academic ability of students, or, better still, their underlying intelligence, might therefore be an extremely valuable component of any international ranking system.

Research Blogs has a table, derived from data from the Higher Education Funding Council for England, of universities ranked by the percentage of UK students with AAB or better at A level.

If it were possible to calculate equivalences between the standardised tests or qualifications of various countries, with some sort of adjustment for national differences in education or literacy, then a global ranking of universities according to student quality would be quite feasible.
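As a very rough sketch of what such an equivalence might look like: convert each student's score to a z-score against his or her own national distribution, apply whatever national adjustment seems defensible, and average over a university's intake. Everything below (the national means and standard deviations, the zero adjustments, the function names) is hypothetical and for illustration only.

```python
# Hypothetical cross-national index of student quality.
# National norms and adjustments are invented for illustration;
# real values would need careful calibration.

NATIONAL_NORMS = {                        # mean and SD of each national test
    "UK": {"mean": 210.0, "sd": 60.0},    # e.g. a tariff-point scale
    "US": {"mean": 1050.0, "sd": 210.0},  # e.g. an SAT-like scale
}
ADJUSTMENT = {"UK": 0.0, "US": 0.0}       # placeholder offsets for national
                                          # differences in education/literacy

def z_score(score: float, country: str) -> float:
    """Standardise a raw score against its national distribution."""
    norm = NATIONAL_NORMS[country]
    return (score - norm["mean"]) / norm["sd"] + ADJUSTMENT[country]

def university_student_quality(intake: list[tuple[float, str]]) -> float:
    """Average standardised score over an intake of (score, country) pairs."""
    return sum(z_score(s, c) for s, c in intake) / len(intake)

# A university admitting students from two national systems:
print(university_student_quality([(340, "UK"), (1400, "US"), (290, "UK")]))
```

The hard part, of course, is not the arithmetic but the calibration of the national norms and adjustments.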


In the table below, the first number is the rank by percentage of UK students with AAB+ at A level; the second is the institution's position when ranked by size; then come the institution's name, the number of UK students and the percentage with AAB+.

Rank by AAB+ %   Rank by size   Institution   UK students   % AAB+
1 3 University of Oxford 2568 99
2 4 University of Cambridge 2554 99
3 18 Imperial College London 1094 96
4 23 London School of Economics 686 93
5 2 University of Durham 2581 85
6 8 University of Bristol 2199 85
7 13 University College London 1648 82
8 9 University of Warwick 2068 81
9 7 University of Exeter 2368 74
10 15 University of Bath 1496 69
11 17 King's College 1238 68
12 57 Royal Veterinary College 195 60
13 86 Conservatoire for Dance & Drama 71 59
14 35 SOAS 353 57
15 5 University of Nottingham 2505 57
16 12 University of Southampton 1686 54
17 14 University of York 1538 53
18 1 University of Manchester 2776 51
19 11 University of Sheffield 1846 49
20 10 University of Birmingham 1883 48
21 6 University of Leeds 2376 47
22 16 University of Newcastle 1332 43
23 19 Loughborough University 1042 38
24 91 School of Pharmacy 59 37
25 21 Lancaster University 740 32
26 25 Royal Holloway, London 589 32
27 20 University of Liverpool 886 32
28 33 City University 414 31
29 26 University of Leicester 578 30
30 22 Queen Mary 702 29
31 24 University of Sussex 630 29
32 32 Aston University 429 25
33 27 University of East Anglia 538 25
34 93 Blackpool and the Fylde 52 24
35 30 University of Surrey 433 23
36 95 Blackburn College 51 22
37 29 University of Reading 455 19
38 58 Goldsmiths College 180 17
39 72 University of Chichester 114 13
40 37 Brunel University 325 12
41 84 University College Falmouth 78 11
42 31 University of the Arts London 430 11
43 28 University of Kent 480 10
44 76 University of Cumbria 105 10
45 89 Arts UC at Bournemouth 67 9
46 67 Bath Spa University 144 9
47 55 University of Essex 200 8
48 56 University of Teesside 197 8
49 43 Southampton Solent 254 8
50 69 University Creative Arts 136 8
51 48 University of Lincoln 232 8
52 68 University of Worcester 142 8
53 80 Liverpool Hope University 97 8
54 42 Coventry University 262 7
55 50 Bournemouth University 227 7
56 54 Oxford Brookes University 213 7
57 63 University of Bedfordshire 163 7
58 64 University of East London 157 7
59 79 Edge Hill University 97 7
60 51 University of Brighton 215 7
61 39 University of Northumbria 283 7
62 41 University of Plymouth 278 7
63 62 London South Bank 165 7
64 60 Birmingham City University 170 6
65 75 University of Northampton 110 6
66 36 Sheffield Hallam University 340 6
67 65 Anglia Ruskin University 156 6
68 34 Nottingham Trent University 357 6
69 49 University of Hull 228 6
70 78 Keele University 99 6
71 77 University of Chester 100 6
72 52 University of Huddersfield 214 6
73 40 Kingston University 280 6
74 44 University West of England 238 5
75 46 University of Westminster 235 5
76 70 University of Derby 133 5
77 92 University of Bolton 55 5
78 73 University of Bradford 112 5
79 87 Thames Valley University 68 5
80 38 Manchester Metropolitan 290 5
81 47 De Montfort University 232 5
82 45 London Metropolitan 237 5
83 53 Liverpool John Moores 213 5
84 71 Middlesex University 126 5
85 61 Central Lancashire 168 4
86 94 Roehampton University 51 4
87 83 University of Sunderland 79 4
88 90 University of Gloucestershire 65 4
89 59 University of Greenwich 178 3
90 66 University of Portsmouth 152 3
91 74 Leeds Metropolitan 111 3
92 88 University of Salford 67 3
93 85 University of Wolverhampton 76 2
94 82 University of Hertfordshire 85 2
95 81 Staffordshire University 91 2

Monday, August 08, 2011

Worth Reading

University World News has some new articles on rankings by Ellen Hazelkorn, Philip Altbach and Danny Byrne of QS. It also provides links to several older articles.

Tuesday, August 02, 2011

Ranking Fever Rages in the US

Forget Shanghai, QS and Times Higher. The really important ranking has just been released by Princeton Review.

Ohio University in Athens has replaced the University of Georgia in Athens as the top party school (one wonders if all the respondents could remember where they were).

Other number ones are:

Amazing College Campus: Elon University, NC
Top Online University: Penn State University World Campus 

And more to come.

Sunday, July 31, 2011

Latest Webometrics Rankings

Webometrics have just released their latest rankings. These are based on the web-related activities of universities as measured by:
  • the number of pages recovered from four engines: Google, Yahoo, Live Search and Exalead
  • the total number of unique external links received (inlinks)
  • rich files in Adobe Acrobat (.pdf), Adobe PostScript (.ps), Microsoft Word (.doc) and Microsoft Powerpoint (.ppt).
  • the number of papers, reports and other academic items retrieved from the Google Scholar database
The Webometrics ranking might be considered a crude instrument, but it does measure something that, while not synonymous with quality, is arguably a necessary precondition for it.
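As an illustration of how indicators like these might be combined into a single ranking (the weights and the rank-combination approach below are assumptions for the sketch, not Webometrics' published method):

```python
# Sketch of a Webometrics-style composite built from the four web
# indicators listed above. Weights are illustrative assumptions.

WEIGHTS = {"visibility": 0.50, "size": 0.20, "rich_files": 0.15, "scholar": 0.15}

def composite_rank(ranks: dict[str, int]) -> float:
    """Combine per-indicator rank positions into one score (lower is better).

    Combining ranks rather than raw counts damps the effect of a single
    outlier indicator on the overall position."""
    return sum(WEIGHTS[k] * ranks[k] for k in WEIGHTS)

# A university ranked 3rd for inlinks, 10th for pages, 7th for rich
# files and 5th for Scholar items:
print(composite_rank({"visibility": 3, "size": 10, "rich_files": 7, "scholar": 5}))
```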

Here are the top three in each region:

USA and Canada
1. MIT
2. Harvard
3. Stanford

Latin America
1.  Sao Paulo
2.  National Autonomous University of Mexico
3.  Federal University of Rio Grande do Sul, Brazil

Europe
1.  Cambridge
2.  Oxford
3.  Southampton

Central and Eastern Europe
1.  Charles University in Prague
2.  Masaryk University in Brno
3.  Ljubljana, Slovenia

Asia
1.  National Taiwan University
2.  Tokyo
3.  Kyoto

South East Asia
1.  National University of Singapore
2.  Kasetsart, Thailand
3.  Chulalongkorn, Thailand

South Asia
1.  IIT Bombay
2.  IISc Bangalore
3.  IIT Kanpur

Arab World
1.  King Saud University
2.  King Fahd University of Petroleum and Minerals
3.  King Abdul Aziz University

Oceania
1.  Australian National University
2.   Melbourne
3.  Queensland

Africa
1.  Cape Town
2.  Pretoria
3.  Stellenbosch

Tuesday, July 19, 2011

Rankings as Imperialism

A conference was held in Malaysia recently, ostensibly to "challenge Western stereotypes of knowledge."

There was a comment on international university rankings by James Campbell of Universiti Sains Malaysia.

"Others warn of the threats of new colonialism practices such as rankings exercises.

“This is another form of imperialism as universities have to conform with publishing in ISI (Institute for Scientific Information) journals in order to be ranked among the best in the world,” says Campbell."


There are many things wrong with rankings but this is not a valid criticism. The Shanghai rankings have shown the steady advance of Chinese and Korean and, to a lesser extent, Latin American and Southwest Asian universities. The QS rankings (formerly THE-QS) were notoriously biased towards Southeast Asia, with a heavy weighting given to a survey originally based largely on the mailing lists of a Singapore-based publishing company (that may no longer be the case).

As for the current THE-Thomson Reuters rankings, they have declared an Egyptian university to be the fourth best in the world for research impact.

The inadequacies of current rankings have been discussed here and elsewhere. But whether it is helpful to anyone to reject them altogether is very debatable.

Most of the conference was devoted not to rankings per se but to supposed critiques of western science. Readers may judge these for themselves.

Sunday, July 17, 2011

Pseudo-science in the academy

A comment by Ameen Amjad Khan in University World News draws attention to the continuing problem of pseudo-science in universities. He lists creationism, anti-evolutionism, magnetic healing, perpetual motion, quantum mysticism, New Age physics, parapsychology, repressed memory, homeopathy and fake self-help schemes.

To which we could add some products of pseudo-social science such as multiple intelligences, emotional and spiritual quotient, Outcomes Based Education and just about anything related to management studies.

Off topic a bit

The Independent has an article by Alex Duval Smith about "the man who proved that everyone is good at maths".

It describes a French academician, Marc Chemillier, who has written a book, "Les Mathematiques Naturelles", which claims that maths is simple and rooted in human sensory intuition. He has travelled to Madagascar because "he believes that Madagascar's population, which remains relatively untouched by outside influences, can help him to prove this".

Smith quotes Chemillier as saying: "There is a strong link between counting and the number of fingers on our hands. Maths becomes complicated only when you abandon basic measures in nature, like the foot or the inch, or even the acre, which is the area that two bulls can plough in a day."

Ploughing a field with bulls is natural? Isn't that a little ethnocentric and chronocentric?

Smith goes on:

"To make his point, Mr Chemillier chose to charge up his laptop computer, leave Paris and do the rounds of fortune tellers on the Indian Ocean island [Madagascar] because its uninfluenced natural biodiversity also extends to its human population. Divinatory geomancy – reading random patterns, or sikidy to use the local word – is what Raoke does, when not smoking cigarettes rolled with paper from a school exercise book."

The idea that the population of Madagascar is untouched, even relatively,  by outside influences is rather odd. The ancestors of the Malagasy travelled across the Indian Ocean from Borneo, a voyage more arduous than those of Columbus. Since then, the island has received immigrants and ideas from and traded with East Africa, Arabia, Persia, India and Europe. Sikidy itself is a local adoption of the medieval Muslim art of divination, adapted to local conditions.

It is difficult to see how Raoke's ability to recall complex patterns created by removing seeds in ones or twos from piles proves that everybody is good at maths. He has probably been divining for half a century and it is a safe bet that he has put in the ten thousand hours that Malcolm Gladwell thinks is necessary to turn anyone into a genius.

I suspect, however, that we are going to hear  more about the diviners of Madagascar as universities and schools throughout the world are relentlessly dumbed down. No need to study the needless complexities of calculus: a pile of seeds and illiterate intuition is all you need.

Thursday, July 14, 2011

What do you do if you hold a quality initiative and nobody comes?

The Higher Education Commission of  Pakistan is proposing to rank all private and public universities in the country. Unfortunately, the universities do not seem very keen on the idea and most of them are not submitting data.


"An official of HEC told APP that all public and private Higher Education Institutions (HEIs) were asked to submit their data by July 15 for carrying out ranking process but around 10 out of 132 universities have submitted their data. HEC is taking the initiative of ranking the universities to help strengthen their indigenous quality culture and improve international visibility. The HEC has already directed the universities to meet the deadline for providing authentic data and those which failed to provide data will be ranked zero by placing them at the bottom in the ranking list to be published through print media.

HEC’s initiative to carry out quality-based ranking of Higher Education Institutions (HEIs) is aimed at international compatibility, primarily based on the QS Ranking System acceptable widely across countries. The commission has taken various initiatives to bring HEIs of Pakistan at par with international standards and ranking is one of the measures to scale the success of efforts to achieve international competitiveness in education, research and innovation."


The problem of conscientious objectors and of universities that might simply not be able to collect data is one that has plagued global and national rankers from the beginning. Times Higher and Thomson Reuters allow universities to opt out but that is risky if those opting out include the likes of Texas at Austin. On the other hand, QS will collect data from third party and national sources if universities fail to cooperate.

Wednesday, July 13, 2011

What Global Rankings Ignore
(at least some of them)

Inside Higher Ed has an article by Indira Samarasekera, president and vice-chancellor of the University of Alberta, that voices some fairly conventional complaints about international university rankings. She has some praise for two of the rankers:

"The problems with national and international rankings are numerous and well known. So well known, in fact, that the world’s most powerful ranking organizations — the World’s Best Universities Rankings conducted by U.S. News & World Report in partnership with Quacquarelli Symonds and the Times Higher Education Rankings — have been working diligently to revise ranking measures and their methods in an attempt to increase the accuracy and objectivity of the rankings.

It should be pointed out that U.S. News & World Report does not conduct any world rankings: it just publishes those prepared by QS. And I wonder how successful those diligent attempts will be.

She goes on:

"From my perspective, rankings are also missing the mark by failing to shine a light on some of the most significant benefits that universities bring to local, national and global societies. The focus of most rankings is on academic research outputs — publications, citations and major awards — that stand in as proxies for research quality and reach. While these outputs do a fairly good job of pinpointing the impact of a university’s contributions to knowledge, especially in science, technology, engineering and health sciences, they provide little indication of what kind of impact these advancements have on factors that the global community generally agrees are markers of prosperous and secure societies with a high quality of life.

Let me give you an example of what I mean: governments and policy makers everywhere now consider universities as economic engines as well as educational institutions. Public investments in research are increasingly directed toward research with the potential to translate into products, processes and policies — even whole new industries. This trend in research funding reveals a lot about the ways in which universities matter to governments, policy makers, regions and the public today, but the rankers aren’t paying attention.

Consider Israel. According to data on NASDAQ’s website, Israel has more companies listed on the NASDAQ stock exchange than any other country in the world except the U.S., and major companies such as Intel, Microsoft, IBM and Google have major research and development centers in Israel. Why? If you look at the data, you see a correlation between this entrepreneurial activity and the investments in and outputs from Israel’s universities.

Israel is among a handful of nations with the highest public expenditure on educational institutions relative to GDP, and it has the highest rate of R&D investment relative to GDP in the world. It also has the highest percentage of engineers in the work force and among the highest ratio of university degrees per capita. Many of the companies listed on NASDAQ were started by graduates of Israel’s universities: Technion, Tel Aviv University, Weizmann Institute and Hebrew University of Jerusalem, to mention a few. Do international university rankings capture these economic impacts from research and postsecondary education in Israel? The answer is no. In spite of their tremendous impact and output, Israel’s universities are ranked somewhere in the 100 to 200 range."

In fact, the Shanghai rankings had the Hebrew University of Jerusalem in 72nd position in 2010, and the percentage of Israeli universities in the Shanghai 500 was higher than for any other country. So the vice-chancellor's logic leads to the conclusion that Shanghai does a better job of capturing this aspect of excellence than QS or THE.

Tel Aviv University and the Hebrew University of Jerusalem were not in the THE 200 or indeed the THE top 400. What happened is that Thomson Reuters either did not receive or did not ask for the information.

Tuesday, July 12, 2011

This WUR had such promise

The new Times Higher Education World University Rankings of 2010 promised much: new indicators based on income, a reformed survey that included questions on postgraduate teaching, and a reduction in the weighting given to international students.

But the actual rankings that came out in September were less than impressive.  Dividing the year's intake of undergraduate students by the total of academic faculty looked rather odd. Counting the ratio of doctoral students to undergraduates, while omitting masters programs, was an invitation to the herding of marginal students into substandard doctoral degree programmes.

The biggest problem, though, was the insistence on giving a high weighting (somewhat higher than originally proposed) to citations. Nearly a third of the total weighting was assigned to the average citations per paper, normalized by field and year. The collection of citation statistics is the bread and butter of Thomson Reuters (TR), THE’s data collector, and one of their key products is the Incites system, which apparently was the basis for their procedure in the 2010 ranking exercise. This compares the citation records of academics with international benchmark scores for year and field. Of course, those who want to know exactly where they stand have to find out what the benchmark scores are, and that is something that cannot easily be calculated without Thomson Reuters.

Over the last two or three decades the number of citations received by papers, along with the amount of money attracted from funding agencies, has become an essential sign of scholarly merit. Things have now reached the point where, in many universities, research is simply invisible unless it has been funded by an external agency and then published in a journal noted for being cited frequently by writers who contribute to journals that are frequently cited. The boom in citations has begun to resemble classical share and housing bubbles as citations acquire an inflated value increasingly detached from any objective reality.

It has become clear that citations can be manipulated as much as, perhaps more than, any other indicator used by international rankings. Writers can cite themselves, they can cite co-authors, they can cite those who cite them. Journal editors and reviewers can suggest to submitters whom they should cite. And so on.

Nobody, however, realized quite how fragile citations might become until the unplanned intersection of THE’s indicator with a bit of self-citation and mutual citation by two peripheral scientific figures raised questions about the whole business.

One of these two was Mohamed El Naschie, who comes from a wealthy Egyptian family. He studied in Germany and took a PhD in engineering at University College London. He then taught in Saudi Arabia while writing several papers that appear to have been of an acceptable academic standard, although not very remarkable.

But this was not enough. In 1993 he started a new journal dealing with applied mathematics and theoretical physics called Chaos, Solitons and Fractals (CSF), published by the leading academic publisher Elsevier. El Naschie’s journal published many papers written by himself. He has, to his credit, avoided exploiting junior researchers or insinuating himself into research projects to which he has contributed little. Most of his papers do not appear to be research but rather theoretical speculations, many of which concern the disparity between the mathematics that describes the universe and that which describes subatomic space, and suggestions for reconciling the two.

Over the years El Naschie has listed a number of universities as affiliations, the University of Alexandria being among the most recent. It was not clear, however, what he did at or for the university, and it was only recently, after the publication of the 2010 THE World University Rankings, that any official connection was documented.

El Naschie does not appear to be highly regarded by physicists and mathematicians, as noted earlier in this blog,  and he has been criticized severely in the physics and mathematics blogosphere.  He has, it is true, received some very vocal support but he is not really helped by the extreme enthusiasm and uniformity of style of his admirers. Here is a fairly typical example, from the comments in Times Higher Education: 
“As for Mohamed El Naschie, he is one of the most original thinkers of our time. He mastered science, philosophy, literature and art like very few people. Although he is an engineer, he is self taught in almost everything, including politics. Now I can understand that a man with his charisma and vast knowledge must be the object of envy but what is written here goes beyond that. My comment here will be only about what I found out regarding a major breakthrough in quantum mechanics. This breakthrough was brought about by the work of Prof. Dr. Mohamed El Naschie”
Later Ji-Huan He, a professor at Donghua University, China, and an editor at El Naschie’s journal, started a similar publication, the International Journal of Nonlinear Sciences and Numerical Simulation (IJNSNS), whose editorial board included El Naschie. This journal was published by the respectable and unpretentious Israeli company Freund of Tel Aviv. Ji-Huan He’s journal has published 29 of his own papers and 19 by El Naschie. The two journals have contained articles that cite and are cited by articles in the other. Since they deal with similar topics some degree of cross-citation is to be expected, but here it seems unusually large.

Let us look at how El Naschie worked. An example is his paper, ‘The theory of Cantorian spacetime and high energy particle physics (an informal review)’, published in Chaos, Solitons and Fractals, 41/5, 2635-2646, in September 2009.

There are 58 citations in the bibliography. El Naschie cites himself 24 times: 20 citations are to papers in Chaos, Solitons and Fractals and 4 to papers in IJNSNS. Ji-Huan He is cited twice, along with four other authors from CSF. This paper has itself been cited 11 times, ten of them in issues of CSF published later in the year.

Articles in mathematics and theoretical physics do not get cited very much. Scholars in those fields prefer to spend time thinking about an interesting paper before settling down to comment. Hardly any papers get even a single citation in the same year. Here we have 10 for one paper. That might easily be 100 times the average for that discipline and that year.

The object of this exercise had nothing to do with the THE rankings. What it did do was push El Naschie’s journal into the top ranks of scientific journals as measured by the Journal Impact Factor, that is, the number of citations per paper within a two-year period. It also meant that for a brief period El Naschie was listed by Thomson Reuters’ Science Watch as a rising star of research.
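The arithmetic of the standard two-year impact factor is simple, which is precisely what makes it easy to push around; a minimal sketch (the example numbers are invented):

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Two-year Journal Impact Factor for year Y: citations received in
    Y to items published in Y-1 and Y-2, divided by the number of
    citable items published in Y-1 and Y-2."""
    return citations / citable_items

# A journal with 100 citable items and 250 citations has an IF of 2.5.
print(impact_factor(250, 100))        # 2.5
# Two hundred extra citations from self- and cross-citation lift it to 4.5.
print(impact_factor(250 + 200, 100))  # 4.5
```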

Eventually, Elsevier appointed a new editorial board at CSF that did not include El Naschie. The journal did however continue to refer to him as the founding editor. Since then the number of citations has declined sharply.

Meanwhile, Ji-Huan He was also accumulating a large number of citations, many of them from conference proceedings that he had organized. He was launched into the exalted ranks of the ISI Highly Cited Researchers and his journal topped the citation charts in mathematics. Unfortunately, early this year Freund sold off its journals to the reputed German publisher De Gruyter, which appointed a new editorial board that included neither him nor El Naschie.

El Naschie, He and a few others have been closely scrutinized by Jason Rush, a mathematician formerly of the University of Washington. Rush was apparently infuriated by El Naschie's unsubstantiated claims to have held senior positions at a variety of universities including Cambridge, Frankfurt, Surrey and Cornell. Since 2009 he has maintained, perhaps a little obsessively, a blog, El Naschie Watch, that chronicles the activities of El Naschie and those associated with him. Most of what is known about El Naschie and He was unearthed there.

Meanwhile, Thomson Reuters were preparing their analysis of citations for the THE rankings. They used the Incites system and compared the number of citations with benchmark scores representing the average for year and field.
This meant that, for this criterion, a high score did not necessarily represent a large number of citations. It could simply represent more citations than normal in a short period of time in fields where citation is infrequent and, perhaps more significantly since we are talking about averages, a small total number of publications. Thus Alexandria, with only a few publications but listed as the affiliation of an author who was cited much more frequently than usual in theoretical physics and applied mathematics, did spectacularly well.
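A toy calculation shows the mechanism. Suppose, as in an Incites-style normalisation, each paper's citation count is divided by the world benchmark for its field and year, and the institution's score is the average of those ratios; with a small publication count, a few extreme papers dominate. All the numbers below are invented for illustration.

```python
# Toy illustration of a field-normalised "research influence" average.
# Each paper is (citations, world_benchmark_for_its_field_and_year).

def research_influence(papers: list[tuple[int, float]]) -> float:
    """Mean ratio of actual citations to the field/year benchmark."""
    return sum(c / b for c, b in papers) / len(papers)

# A large university: 10,000 papers cited at about the world average.
big = [(10, 10.0)] * 10_000
# A small one: 57 ordinary papers plus 3 papers by one author cited
# far above the benchmark in a low-citation field.
small = [(2, 10.0)] * 57 + [(50, 0.5)] * 3

print(research_influence(big))    # ~1.0: world average
print(research_influence(small))  # ~5.2: a handful of papers, a huge score
```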


This is rather like declaring Norfolk (very flat according to Oscar Wilde) the most mountainous county in England because of a few hillocks that were nonetheless relatively much higher than the surrounding plains.

Thomson Reuters would have done themselves a lot of good if they had taken the sensible course of using several indicators of research impact, such as total citations, citations per faculty, the h-index or references in social media; or if they had allocated a smaller weighting to the indicator; or if they had imposed a reasonable threshold number of publications instead of just 50; or if they had excluded self-citations and intra-journal citations; or if they had devised a formula to detect mutual citation.

So, in September THE published its rankings with the University of Alexandria in the top 200 overall and in fourth place for research impact, ahead of Oxford, Cambridge and most of the Ivy League. Not bad for a university that had not even been counted by HEEACT, QS or the Shanghai rankings, and that in 2010 had lagged behind two other institutions in Alexandria itself in Webometrics.

When the rankings were published THE pointed out that Alexandria had once had a famous library and that a former student had gone on to the USA to eventually win a Nobel prize decades later. Still, they did concede that the success of Alexandria was mainly due  to one "controversial" author.

Anyone with access to the Web of Science could determine in a minute precisely who the controversial author was. For a while it was unclear exactly how a few dozen papers and a few hundred citations could put Alexandria among the world’s elite. Some observers wasted time wondering if  Thomson Reuters had been counting papers from a community college in Virginia or Minnesota, a branch of the Louisiana State University or federal government offices in the Greater Washington area. Eventually, it was clear that El Naschie could not, as he himself asserted, have done it by himself: he needed the help of the very distinctive features of Thomson Reuters’ methodology.

There were other oddities in the 2010 rankings. Some might have accepted a high placing for Bilkent University in Turkey, which was well known for its Academic English programs. It also had one much-cited article whose apparent impact was increased because it was classified as multidisciplinary, usually a low-citation category, thereby scoring well above the world benchmark. However, when regional patterns were analyzed the rankings began to look rather strange, especially on the research impact indicator. In Australia, the Middle East, Hong Kong and Taiwan the order of universities looked rather different from what local experts expected. Hong Kong Baptist University the third best in the SAR? Pohang University of Science and Technology so much better than Yonsei or KAIST? Adelaide the fourth best Australian university?

In the UK or the US these placings might seem plausible or at least not worth bothering about. But in the Middle East the idea of Alexandria as top university even in Egypt is a joke and the places awarded to the others look very dubious.

THE and Thomson Reuters tried to shrug off the complaints by saying that there were just a few outliers which they were prepared to debate and that anyone who criticized them had a vested interest in the old THE-QS rankings which had been discredited. They  dropped hints that the citations indicator would be reviewed but so far nothing specific has emerged.

A few days ago, however,  Phil Baty of THE seemed to imply that there was nothing wrong with the citations indicator.
Normalised data allow fairer comparisons, and that is why Times Higher Education will employ it for more indicators in its 2011-12 rankings, says Phil Baty.
One of the most important features of the Times Higher Education World University Rankings is that all our research citations data are normalised to take account of the dramatic variations in citation habits between different academic fields.
Treating citations data in an “absolute manner”, as some university rankings do, was condemned earlier this year as a “mortal sin” by one of the world’s leading experts in bibliometrics, Anthony van Raan of the Centre for Science and Technology Studies at Leiden University. In its rankings, Times Higher Education gives most weight to the “research influence” indicator – for our 2010-11 exercise, this drew on 25 million citations from 5 million articles published over five years. The importance of normalising these data has been highlighted by our rankings data supplier, Thomson Reuters: in the field of molecular biology and genetics, there were more than 1.6 million citations for the 145,939 papers published between 2005 and 2009; in mathematics, however, there were just 211,268 citations for a similar number of papers (140,219) published in the same period.
To ignore this would be to give a large and unfair advantage to institutions that happen to have more provision in molecular biology, say, than in maths. It is for this crucial reason that Times Higher Education’s World University Rankings examine a university’s citations in each field against the global average for that subject.

Yes, but when we are assessing hundreds of universities in very narrowly defined fields we start running into quite small samples that can be affected by deliberate manipulation or by random fluctuations.
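A quick simulation makes the small-sample point. The distributional assumptions here are mine, chosen only to mimic a low-citation field; nothing below comes from THE or Thomson Reuters.

```python
import random

# How much does a field-normalised average bounce around when an
# institution has only a handful of papers in a field? Citation counts
# are drawn from a skewed distribution typical of low-citation fields.
random.seed(0)
BENCHMARK = 1.5  # assumed world-average citations per paper

def normalised_score(n_papers: int) -> float:
    cites = [random.expovariate(1 / BENCHMARK) for _ in range(n_papers)]
    return (sum(cites) / n_papers) / BENCHMARK  # 1.0 = world average

for n in (10, 100, 10_000):
    scores = [normalised_score(n) for _ in range(1000)]
    print(n, round(min(scores), 2), round(max(scores), 2))

# With 10 papers the score ranges over severalfold by pure chance;
# with 10,000 it barely moves. Small samples drive the extremes.
```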

Another point is that if there are many more journals, papers, citations and grants in oncology or genetic engineering than in the spatialization of gender performativity or the influence of Semitic syntax on Old Irish then perhaps society is telling us something about what it values and that is something that should not be dismissed so easily.

So it could be that we are going to get the University of Alexandria in the top 200 again, perhaps joined by Donghua University.

At the risk of being repetitive, there are a few simple things that Times Higher and TR could do to make the citations indicator more credible, and there are more ways of measuring research excellence. Possibly they are thinking about them, but so far there is no sign of it.

The credibility of last year's rankings has declined further with the decisions of the judge presiding over the libel case brought by El Naschie against Nature (see here for commentary). Until now it could be claimed that El Naschie was a well-known scientist by virtue of the large number of citations that he had received, or at least an interesting and controversial maverick.

El Naschie is pursuing a case against Nature for publishing an article that suggested his writings were not of a high quality and that papers published in his journal did not appear to be properly peer reviewed.

The judge has recently ruled that El Naschie cannot proceed with a claim for specific damages since he has not brought any evidence for them. He can proceed only with a claim for general damages for loss of reputation and hurt feelings. Even here it looks like tough going: El Naschie seems unwilling or unable to find expert witnesses to testify to the scientific merits of his papers.

"The Claimant is somewhat dismissive of the relevance of expert evidence in this case, largely on the basis that his field of special scientific knowledge is so narrow and fluid that it is difficult for him to conceive of anyone qualifying as having sufficient "expert" knowledge of the field. Nevertheless, permission has been obtained to introduce such evidence and it is not right that the Defendants should be hindered in their preparations."

He also seems to have problems with locating records that would demonstrate that his many articles published in Chaos, Solitons and Fractals were adequately reviewed.
  1. The first subject concerns the issue of peer-review of those papers authored by the Claimant and published in CSF. It appears that there were 58 articles published in 2008. The Claimant should identify the referees for each article because their qualifications, and the regularity with which they reviewed such articles, are issues upon which the Defendants' experts will need to comment. Furthermore, it will be necessary for the Defendants' counsel to cross-examine such reviewers as are being called by the Claimant as to why alleged faults or defects in those articles survived the relevant reviews.

  2. Secondly, further information is sought as to the place or places where CSF was administered between 2006 and 2008. This is relevant, first, to the issue of whether the Claimant has complied with his disclosure obligations. The Defendants' advisers are not in a position to judge whether a proportionate search has been carried out unless they are properly informed as to how many addresses and/or locations were involved. Secondly, the Defendants' proposed expert witnesses will need to know exactly how the CSF journal was run. This information should be provided.
It would therefore  seem to be getting more and more difficult for anyone to argue that TR's methodology has uncovered a pocket of excellence in Alexandria.

Unfortunately, it is beginning to look as though THE will not only use much the same method as last time but will apply normalisation to other indicators as well.
But what about the other performance indicators used to compare institutions? Our rankings examine the amount of research income a university attracts and the number of PhDs it awards. For 2011-12, they will also look at the number of papers a university has published that are co-authored by an international colleague.
Don’t subject factors come into play here, too? Shouldn’t these also be normalised? We think so. So I am pleased to confirm that for the 2011-12 World University Rankings, Times Higher Education will introduce subject normalisation to a range of other ranking indicators.
This is proving very challenging. It makes huge additional demands on the data analysts at Thomson Reuters and, of course, on the institutions themselves, which have had to provide more and richer data for the rankings project. But we are committed to constantly improving and refining our methodology, and these latest steps to normalise more indicators evidence our desire to provide the most comprehensive and rigorous tables we can.
What this might mean is that universities that spend modest amounts of money in fields where little money is usually spent would get a huge score. So what would happen if an eccentric millionaire left millions to establish a lavishly funded research chair in continental philosophy at Middlesex University?  There are no doubt precautions that Thomson Reuters could take but will they? The El Naschie business does not inspire very much confidence that they will.

The reception of the 2010 THE WUR suggests that many in the academic world have doubts about the wisdom of using normalised citation data without considering the potential for gaming or statistical anomalies. But the problem may run deeper and involve citations as such. QS, THE's rival and former partner, have produced a series of subject rankings based on data from 2010. The overall results for each subject are based on varying combinations of the scores for academic opinion, employer opinion and citations per paper (not per faculty as in the general rankings).

The results are interesting. Looking at citations per paper alone we see that Boston College and Munich are jointly first in Sociology. Rutgers is third for politics and international studies. MIT is third for philosophy (presumably Chomsky and co). Stellenbosch is first for Geography and Area studies. Padua is first for linguistics. Tokyo Metropolitan University is second for biological sciences and Arizona State University first.


Pockets of excellence or statistical anomalies? These results may not be quite as incredible as Alexandria in the THE rankings but they are not a very good advertisement for the validity of citations as a measure of research excellence.

It appears that THE have not made their minds up yet. There is still time to produce a believable and rigorous ranking system. But whatever happens, it is unlikely that citations,  normalized or unnormalized, will continue to be the unquestionable gold standard of academic and scientific research.


    Saturday, July 09, 2011

    The Coming Ascendancy of China

    Matthew Reisz in Times Higher Education reports that Wen Jiabao, Prime Minister of China, has been awarded the King Charles II medal by the Royal Society for an ambitious national research program.

    "The scale and success of Chinese investment in research was reflected in findings released last month by Thomson Reuters - drawing on data collected for the Times Higher Education World University Rankings - which showed that the country's elite C9 League now generates more income per academic staff member than the UK's Russell Group.

    The top Chinese universities also award the highest number of doctoral degrees per academic.
    Despite a vast increase in output over the past decade, there has been no discernible dip in standards, and the quality of the research produced by Chinese universities has remained at about the world average."

    Some warnings are necessary. It is debatable whether the generation of research income is always a good indicator of quality. For one thing, note that the report talks about "per academic staff". Getting rid of or forgetting about "unproductive" departments like philosophy or languages could boost scores as easily as getting grants.

    Still, it seems likely that the Chinese are on the way to scientific supremacy, at least in the natural sciences. There are obstacles ahead such as centralised control that might one day slow down the growth of research. They might even  read  Times Higher and stop being so unpleasantly aggressive and competitive, but at the moment that seems unlikely.

    Incidentally, do the data collected for the THE World University Rankings tell us anything that we couldn't learn from the Shanghai rankings?

    Tuesday, July 05, 2011

    QS Subject Rankings for the Social Sciences

    QS have released their subject rankings for the social sciences based on data gathered during last year's rankings.

    The overall rankings are not surprising. Here are top three in each subject.

    Sociology
    1.  Harvard
    2.  UC Berkeley
    3.  Oxford

    Statistics and Operational Research
    1.  Stanford
    2.  Harvard
    3.  UC Berkeley

    Politics and International Studies
    1.  Harvard
    2.  Oxford
    3.  Cambridge

    Law
    1.  Harvard
    2.  Oxford
    3.  Cambridge

    Economics and Econometrics
    1.  Harvard
    2.  MIT
    3. Stanford

    Accounting and Finance
    1.  Harvard
    2.  Oxford
    3.  MIT

    The top three in the citations per paper indicator are, in most cases, rather different. Are these pockets of excellence or something else?

    Sociology
    1=  Boston College
    1=  Munich
    3.   Florida State University

    Statistics and Operational Research
    1.  Aarhus
    2.  Helsinki
    3.  Erasmus University Rotterdam

    Politics and International Studies
    1.  Yale
    2.  Oslo
    3.  Rutgers

    Law
    1.  Victoria University of Wellington
    2.  Koln
    3.  Munster

    Economics and Econometrics
    1.  Dartmouth
    2.  Harvard
    3.  Princeton

    Accounting and Finance
    1.  University of Pennsylvania
    2=  Harvard
    2=  North Carolina at Chapel Hill

    Friday, July 01, 2011

    Worth Reading

    Andrejs Rauhvargers,  Global University Rankings and their Impact (European University Association).

    Interesting News

    U.S. News are getting ready to start ranking American online colleges.
    The THE Survey

    Times Higher Education and its partner Thomson Reuters have announced the completion of their survey of academic opinion. There were 17,554 responses from 137 countries, nearly a third more than last year. That means nearly 31,000 responses over the last two years but THE, in contrast to their rivals, QS, will only count responses to this year's survey.

    QS have still not closed their survey, so it looks as though they might well push the number of responses over 17,500 and claim victory. THE, no doubt, will point out that all of their respondents are new ones and that QS are counting respondents from 2010 and 2009.

    THE have indicated the number of responses but not the number of survey forms that were sent out. So, the response rate for the survey is still unknown. This is more important for judging the validity of the survey than just the number of responses.

    Sunday, June 19, 2011

    QS Latin American Rankings

    QS have published the results of their preliminary study for a Latin American university ranking. This would be the second in their series of regional rankings, after the Asian rankings, now in their third year.

    The methodology suggested by the rankings is as follows (a sketch of the weighted-sum calculation follows the list):

    Latin American Academic Reputation  30%
    Papers per Faculty  10%
    Citations per Paper 10%
    Student Faculty Ratio 10%
    Staff with Ph D 10%
    Latin American Employer Reputation 20%
    International Faculty 2.5%
    International Students 2.5%
    Inbound Exchange Students 2.5%
    Outbound Exchange Students  2.5%
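    Presumably the overall score is the weighted sum of the indicator scores. A minimal sketch, assuming each indicator has already been scaled to 0-100 (the key names are mine):

```python
# Sketch: combining the QS Latin America indicators with the weights
# listed above. Indicator scores are assumed to be pre-scaled to 0-100.

WEIGHTS = {
    "academic_reputation":   0.30,
    "papers_per_faculty":    0.10,
    "citations_per_paper":   0.10,
    "student_faculty_ratio": 0.10,
    "staff_with_phd":        0.10,
    "employer_reputation":   0.20,
    "intl_faculty":          0.025,
    "intl_students":         0.025,
    "inbound_exchange":      0.025,
    "outbound_exchange":     0.025,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 100%

def overall_score(scores: dict[str, float]) -> float:
    """Weighted sum of per-indicator scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A university scoring 80 on every indicator scores 80 overall.
print(overall_score({k: 80.0 for k in WEIGHTS}))
```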

    QS's surveys have been criticised on several grounds, including low response rates. However, the employer survey is valuable as an external  assessment of universities, while the academic survey might be considered a complement to citations-based indicators which in both the THE and QS rankings have thrown up some odd results.

    There are two indicators that are directly research-based. The apparent ease with which citations can be manipulated means that a variety of indicators could have been used here, including citations per paper, h-index, total publications and citations, proportion of funded research and publications in high-impact journals. QS have missed an opportunity.

    Student faculty ratio is allocated 10% instead of 20% as in the international ranking. This is an admittedly crude proxy for teaching quality. QS are apparently experimenting with a student satisfaction survey, which might produce more valid results.

    Ten per cent goes to the proportion of staff with Ph Ds. This may well encourage the further and pointless over-production of substandard doctorates.

    Five per cent goes to international students and international faculty. I am not sure that this will mean very much, especially in the smaller Central American republics. Counting exchange students is definitely not a good idea: it is something that can easily be manipulated. In the Asian rankings there were some large and puzzling increases in the numbers of exchange students between 2009 and 2010.

    Thursday, June 16, 2011

    The QS Arts and Humanities Rankings

    See here for the complete rankings.

    Here are the top five in each of the three indicators of the QS subject rankings: the academic survey, the employer survey and citations per paper.

    There is nothing surprising about the leaders in the two surveys. But the citations indicator is another matter. Perhaps QS has followed Times Higher in uncovering "clear pockets of excellence". Would any specialists out there like to comment on Newcastle University (the English one, not the Australian) and Durham as joint first for history -- something to do with proximity to Hadrian's Wall? What about Brown for philosophy, Stellenbosch for geography and area studies and Padua for linguistics?

    English Language and Literature
    Academic survey
    1.  Harvard
    2.  Oxford
    3.  Cambridge
    4.  UC Berkeley
    5.  Yale

    Employer Survey
    1.  Oxford
    2.  Cambridge
    3.  Harvard
    4.  MIT
    5.  UC Los Angeles

    No ranking for citations

    Modern Languages
    Academic Survey
    1.  Harvard
    2.  UC Berkeley
    3.  Oxford
    4.  Cambridge
    5.  Cornell

    Employer Survey
    1.  Harvard
    2.  Oxford
    3.  Cambridge
    4.  MIT
    5.  Stanford

    No ranking for citations

    History
    Academic Survey
    1.  Harvard
    2.  Cambridge
    3.  Oxford
    4.  Yale
    5.  UC Berkeley

    Employer Survey
    1. Oxford
    2.  Harvard
    3.  Cambridge
    4.  University of Pennsylvania
    5. Yale

    Citations per Paper
    1=  Newcastle (UK)
    1=  Durham
    3.   Liverpool
    4.   George Washington
    5.   University of Washington

    Philosophy
    Academic Survey
    1.  Oxford
    2.  Harvard
    3.  Cambridge
    4.  UC Berkeley
    5.  Princeton

    Employer Survey
    1.  Cambridge
    2.  Harvard
    3.  Oxford
    4.  MIT
    5.  UC Berkeley

    Citations per Paper
    1.  Brown
    2.  Melbourne
    3.  MIT
    4=  Rutgers
    4=  Zurich


    Geography and Area Studies
    Academic survey
    1.  UC Berkeley
    2.  Cambridge
    3.  Oxford
    4.  Harvard
    5.  Tokyo

    Employer Survey
    1.  Harvard
    2.  Cambridge
    3.  Oxford
    4.  MIT
    5.  UC Berkeley

    Citations per Paper
    1.  Stellenbosch
    2. Lancaster
    3.  Durham
    4.  Queen Mary London
    5.  University of Kansas


    Linguistics
    Academic Survey
    1.  Cambridge
    2.  Oxford
    3.  Harvard
    4.  UC Berkeley
    5.  Stanford

    Employer Survey
    1.  Harvard
    2.  Oxford
    3.  MIT
    4.  UC Berkeley
    5.  Melbourne

    Citations per Paper
    1.  Padua
    2.  Boston University
    3.  York University (UK)
    4.  Princeton
    5.  Harvard

    Tuesday, May 31, 2011

    Asia: Japan Falling, Korea and China Rising


    See my article on the QS Asian Rankings 2011 in University World News.


    Wednesday, May 18, 2011

    The QS Life Sciences Ranking Continued

    Looking at the scores for the three indicators, academic survey, employer survey and citations per paper, we find the situation is similar to that of the engineering rankings released last month. There is a reasonably high correlation between the scores for the two surveys:

    Medicine                     .720
    Biological Sciences      .747
    Psychology                  .570

    The correlations between the score for citations per paper and the academic survey are low but still significant:
    Medicine                          .290
    Biological Sciences           .177
    Psychology                       .217

    The correlations between the citations indicator and the employer survey are low or very low and insignificant:
    Medicine                               .129
    Biological Sciences                .015
    Psychology                           -.027
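
    For readers who want to reproduce figures like these, here is a minimal sketch of the Pearson correlation used in these comparisons; the indicator scores below are invented stand-ins for the real QS data:

        import math

        def pearson(xs, ys):
            """Pearson correlation coefficient between two equal-length lists."""
            n = len(xs)
            mean_x = sum(xs) / n
            mean_y = sum(ys) / n
            cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
            sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
            return cov / (sd_x * sd_y)

        # Invented academic-survey and citations-per-paper scores.
        academic = [95.0, 88.0, 80.0, 62.0, 55.0]
        citations = [70.0, 90.0, 60.0, 65.0, 40.0]
        print(round(pearson(academic, citations), 2))  # about 0.73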


    Looking at the top five universities for each indicator, there are no surprises as far as the surveys are concerned, but some of the universities in the top five for citations do raise some eyebrows. Arizona State University? University of Cincinnati? Tokyo Metropolitan University? Perhaps these are hitherto unnoticed pockets of excellence of the Alexandrian kind?

    Top Five in Medicine

    Academic Survey

    1.    Harvard
    2.    Cambridge
    3.    Oxford
    4.    Stanford
    5.    Yale

    Employer Survey

    1.     Harvard
    2.     Cambridge
    3.     Oxford
    4.     MIT
    5.     Stanford

    Citations per Paper 

    1.    MIT
    2.    Rockefeller University
    3.    Caltech
    4.    The University of Texas M. D. Anderson Cancer Center
    5.     Harvard


    Top Five in Biological Sciences

    Academic Survey

    1.    Cambridge
    2.    Harvard
    3.    UC Berkeley
    4.    Oxford
    5.    MIT

    Employer Survey


    1.  Harvard
    2.  Cambridge
    3.  MIT
    4.  Oxford
    5.  Stanford

    Citations per Paper

    1.  Arizona State University
    2.   Tokyo Metropolitan University
    3.   MIT
    4.   Rockefeller University
    5.   Harvard

    Top Five in Psychology

    Academic Survey

    1.    Harvard
    2.   Stanford
    3.    UC Berkeley
    4.    Cambridge
    5.    Oxford

    Employer Survey 

    1.     Cambridge
    2.     Harvard
    3.     Oxford
    4.     Stanford
    5.     UC Berkeley

    Citations per Paper

    1.     UC Irvine
    2.     Emory
    3.     University of Cincinnati
    4.     Princeton
    5.     Dartmouth College

    Friday, May 06, 2011

    Inappropriate Analogy Watch

    Times Higher Education of April 21st has a rather disconcerting cover, a close-up picture of a bonobo. Inside there is a long article by a graduate student at the University of British Columbia that argues that humans may have been too hasty in assuming that their current aggressive behavior is rooted in their ancestry. He suggests that humanity is more closely related to the bonobos than to the common chimpanzees. The former are peaceful, promiscuous, egalitarian, dominated by females and without hang-ups about homosexuality. They sound rather like a cross between a hippie commune and a humanities faculty at an American state university, or at least like those places would imagine themselves to be. Common chimpanzees, on the other hand, are notorious for behaving like a gang of skinheads on a Saturday night.

    This is a variant of a common theme in popularized social science writing. For a long time, western feminists and leftists have looked to contemporary or historical pre-modern societies for validation, only to find disappointment. Margaret Mead's free-loving Samoans turned out to be rather different, while the search for mother-earth-worshipping matriarchies has been equally futile. Now, it seems, they are forced to go back several million years. Perhaps the bonobo really are what primatologists say they are. But it would be unsurprising if they turned out to be as politically incorrect, competitive and unpleasant as the chimpanzees.

    In any case, it is pseudo-science to suggest that humanity can take any other species as a model or inspiration. There are dozens of extinct species and subspecies between us and the bonobos, some of which may have been even more gentle and promiscuous than the bonobo or even more violent and competitive than the chimpanzee.

     The point of the article is found in an editorial by Ann Mroz in the same issue.

    In higher education, we appear to be moving from an approach based on cooperation to one based on competition, from the bonobo compact to the chimp reforms, if you like. The Browne Review launches us into a quasi-market world, which in itself has far-reaching implications. Unfortunately, it comes on top of a range of pre-existing and co-existing factors: the concentration of research funding; tighter immigration rules; cuts in teacher training and NHS cash; and internationalisation.

    Some post-1992 institutions facing immediate financial constraints are moving swiftly to deal with their problems. London Metropolitan University, for example, is cutting about 400 of its 557 degree courses, and the University of East London is planning to axe its School of Humanities and Social Sciences.

    Staff at the former institution describe the move as "an attempted reversal of widening participation...of everything that London Met...came into existence to promote". Staff at the latter describe its social sciences and humanities as high-performing areas. "Are UEL's non-traditional students going to be denied an academic education on the basis of managers' assumption that all such students are good for - and will be willing to pay for - is training?" they ask.

    She therefore concludes:

    UK universities have survived for 800 years through successful evolution in a relatively stable habitat, a context they share with the cooperative bonobo. The competitive chimpanzee, however, has had to adapt to more hostile conditions. In shaping the next stage of its evolution, the academy has the choice of emulating either the aggressive ape or the better angels of our nature.

    There is a problem with this. The bonobo are close to extinction. There are only 10,000 of them left, compared with 300,000 common chimpanzees, and the only reason those 10,000 have survived is that they are separated from the chimpanzees by the Congo River.

    If Ann Mroz thinks British universities have evolved through cooperation over 800 years, she should start by reading the novels of C. P. Snow. No doubt they have become thoroughly cooperative over the last few years, as diversity workshops, collaborative projects, performance appraisals, quality audits and professional development seminars have eradicated most signs of individuality in their faculty.

    But there is no Congo River separating British universities from all those nerds and buffs in Korea, China and Singapore who work 80 hours a week, refuse to cooperate and are quite uninterested in diversity, safe and comfortable environments and collegiality.

    And just what is so bad about training?