
Tuesday, July 19, 2022

What's the Matter with Harvard?

When the first global ranking was published by Shanghai Jiao Tong University back in 2003, the top place was taken by Harvard. It was the same for the rankings that followed in 2004, Webometrics and the THES-QS World University Rankings. Indeed, at that time any international ranking that did not put Harvard at the top would have been regarded as faulty.

Is Harvard Declining?

But since then Harvard has been dethroned in a few rankings. Now MIT leads in the QS world rankings, while Oxford is first in THE's and the Chinese Academy of Sciences leads the Nature Index. Recently Caltech deposed Harvard at the top of the Round University Rankings, now published in Georgia.

It is difficult to get excited about Oxford leading Harvard in the THE rankings. A table that purports to show Macau University of Science and Technology as the world's most international university, Asia University Taiwan as the most innovative, and An Najah National University as the best for research impact need not be taken too seriously.

Losing out to MIT in the QS world rankings probably does not mean very much either. Harvard is at a serious disadvantage here for international students and international faculty.

Harvard and Leiden Ranking

On the other hand, the performance of Harvard in the CWTS Leiden Ranking, which is generally respected in the global research community, might tell us that something is going on. Take a look at the total number of publications for the period 2017-20 (using the default settings and parameters). There we can see Harvard at the top with 35,050 publications, followed by Zhejiang and Shanghai Jiao Tong Universities.

But it is rather different for publications in the broad subject fields. Harvard is still in the lead for Biomedical Sciences and for Social Sciences and Humanities. For Mathematics and Computer Science, however, the top twenty consists entirely of Mainland Chinese universities. The best non-Mainland institution is Nanyang Technological University in Singapore. Harvard is 128th.

You could argue about whether this is just a matter of quantity rather than quality. So, let's turn to another Leiden indicator, the percentage of publications in the top 10% of journals for Mathematics and Computer Science. Even here China is in the lead, although somewhat precariously: Changsha University of Science and Technology tops the table and Harvard is in fifth place.

The pattern for Physical Sciences and Engineering is similar. The top 19 for publications are Chinese, with the University of Tokyo in 20th place. However, for those in the top 10% Harvard still leads. It seems then that Harvard is still ahead for upmarket publications in physics and engineering, but a growing and substantial amount of research is done by China, a few other parts of Asia, and perhaps some American outposts of scientific excellence such as MIT and Caltech.

The Rise of China

The trend seems clear. China is heading towards industrial and scientific hegemony, and eventually Peking, Tsinghua, Fudan, Zhejiang and a few others will, if nothing changes, surpass the Ivy League, the Group of Eight, and Oxbridge, although it will take longer for the more expensive and demanding fields of research. Perhaps the opportunity will be lost in the next few years if there is another proletarian cultural revolution in China or if Western universities change course.

What Happened to Harvard's Money?

It is standard to claim that the success or failure of universities depends on the amount of money they receive. The latest edition of the annual Nature Index tables was accompanied by headlines proclaiming that China's recent success in high-impact research was the result of a long-term investment program.

Money surely had a lot to do with it, but there needs to be a bit of caution here. The higher education establishment has a clear vested interest in getting as much money from the public purse as it can and is inclined to claim that any decline in the rankings is the result of hostility to higher education.

Tracing the causes of Harvard's decline, we should consult the latest edition of the Round University Rankings, which provides ranks for 20 indicators. In 2021 Harvard was first, but this year it was second, replaced by Caltech. So what happened? Looking more closely, we see that in 2021 Harvard was 2nd for financial sustainability and in 2022 it was 357th. That suggests a catastrophic financial collapse. So maybe there has been a financial disaster over at Harvard and the media simply have not noticed bankrupt professors jumping out of their offices, Nobel laureates hawking their medals, or mendicant students wandering the streets with tin cups.

Zooming in a bit, it seems that, if the data is accurate, there has been a terrible collapse in Harvard's financial fortunes. For institutional income per academic staff Harvard's rank has gone from 21st to 891st.

Exiting sarcasm mode for a moment: it is of course impossible that there has actually been such a catastrophic fall in income. I suspect that what we have here is something similar to what happened to Trinity College Dublin a few years ago, when someone forgot the last six zeros when filling out the form for the THE world rankings.

So let me borrow a flick knife from my good friend Occam and propose that what happened to Harvard in the Round University Rankings was simply that somebody left off the zeros at the end of the institutional income number when submitting data to Clarivate Analytics, who do the statistics for RUR. I expect next year the error will be corrected, perhaps without anybody admitting that anything was wrong.

So, there was no substantial reason why Harvard lost ground to Caltech in the Round Rankings this year. Still it does say something that such a mistake could occur and that nobody in the administration noticed or had the honesty to say anything. That is perhaps symptomatic of deeper problems within American academia. We can then expect the relative decline of Harvard and the rise of Chinese universities and a few others in Asia to continue.





Friday, February 22, 2008

More on Cambridge and Harvard

Alejandro Pisanty has an interesting comment on the previous post. I will reproduce a large part of it here.

“In particular for Harvard it's darn tricky. "Harvard University" will
yield only a fraction of the production and the citations from there.
There's also Harvard Medical School, Harvard Business School, Harvard
Law School, etc., and nifty arrangements like Harvard-Smithsonian
Astronomy Project (also in Cambridge MA; surely a half-floor of a physics or
astronomy unit in the best of cases) and so on.

QS and THES admit quite cynically that they don't really know too well
how to treat "children institutions". One can be sure that officials from
Harvard and Cambridge, and all British universities, have been well on
top of this by constant contact with QS and their staff. And, it all
happens in English.

One would reasonably expect that QS does not apply the same care to
Malaysian or Mexican universities...”


The number of papers produced by the Harvard Business and Law Schools is relatively small, although still a lot more than the Judge School of Business at Cambridge or Addenbrooke's Hospital. Harvard Medical School, however, does produce a massive number of papers: over 35,000 according to Scopus between 2002 and 2006. Compare this with 12,736 for "Harvard University" over the same period.

If QS did indeed count the papers produced by authors with a Harvard Medical School affiliation, this would be an adequate -- probably more than adequate -- explanation for Harvard’s superiority over Cambridge in terms of citations. But another problem now arises. The number of citations per faculty would then be much larger than Caltech's, yet Caltech does a bit better than Harvard in the THES-QS citations per faculty section.

It is possible that QS included the papers produced by HMS and then also counted "about [sic] 10,674 medical school faculty". Not to do so would be absurd, since any other procedure would mean that linguists, sociologists and engineers were getting credit for producing medical research.

But if QS counted papers with a Harvard Medical School affiliation and also counted all the medical faculty then we would be back where we started.

It still seems to me that the most plausible reconstruction of Harvard’s citations per faculty score is that QS did not count papers produced by the various schools, or at least not by the Harvard Medical School, and that for the faculty figure they used the number given on the Harvard website or in QS’s school profile.

All this speculation would be unnecessary if QS told us exactly what they did but I wouldn’t bother waiting for that to happen.

Monday, December 24, 2007

Cambridge and Harvard

The THES-QS rankings can be viewed as a collection of complex interweaving narratives. There is the rise of China and its diaspora, the successful response of Australian universities to financial crisis, the brave attempts of Africa, spearheaded by the University of Cape Town, to break into the top 200.

The most interesting narrative is that of British universities -- Oxford, Cambridge and Imperial and University Colleges, London -- steadily coming closer to Harvard and pulling ahead of Princeton, Caltech and the rest.

This particular narrative requires rather more suspension of disbelief than most. By all accounts, including the Shanghai rankings and THES’s own count of citations per faculty, the research record of Cambridge and Oxford has been less than spectacular for several years.

Until this year Cambridge’s apparent near equality with Harvard was largely the result of its performance on QS’s survey of academic opinion, the so-called peer review. Since this has such an astonishingly low response rate, since it is noticeably biased against the US, since its relationship with research proficiency measured by citations per faculty or per paper is very limited, it should not be taken seriously.

This year methodological changes mean that the differences between Cambridge and Harvard on most measures are virtually obliterated. Both universities get 100 or 99 for the “peer review”, employer review and student faculty ratio. Both get 91 for international students.

Harvard stays ahead of Cambridge because of a much better performance on citations per faculty. I thought it might be interesting to see how this margin was achieved.

QS is now using the Scopus database, for which a 30-day free trial is available. THES states that the consultants counted the number of citations of papers published between 2002 and 2006 and then divided the total by the number of faculty. I have tried to reproduce QS's scores for Cambridge and Harvard.

First, here is the number of papers published by authors with an affiliation to “Cambridge University” between 2002 and 2006 and the number of citations of those papers. The number of documents in the Scopus database is increasing all the time so a count done today would yield different results. These numbers are from two weeks ago.

CAMBRIDGE (“Cambridge University”) 2002-2006

Life sciences: 7,614 documents, 116,875 citations
Health sciences: 4,406 documents, 65,211 citations
Physical sciences: 11,514 documents, 100,225 citations
Social sciences: 2,636 documents, 24,292 citations

Total: 26,170 documents, 306,603 citations

Using the FTE faculty figure of 3,765 provided by QS on their website, we have about 81 citations per faculty.

I noticed that a number of authors gave their affiliation as “University of Cambridge”. This added 26,710 citations to make a total of 333,313 citations and 89 citations per faculty.

Now for Harvard. Searching the Scopus database reveals the following totals of papers and citations for “Harvard University”.

HARVARD ("Harvard University") 2002-2006

Life sciences: 4,003 documents, 79,663 citations
Health sciences: 2,577 documents, 47,486 citations
Physical sciences: 6,429 documents, 91,154 citations
Social sciences: 3,686 documents, 48,844 citations

Total: 16,695 documents, 267,147 citations

I suspect that most observers would consider Cambridge's superiority to Harvard in number of publications and citations indicative more of the bias of the database than anything else.


If we use QS’s faculty headcount figure for Harvard of 3,389 and assume that 8 per cent of these are part-timers with a quarter-time teaching load, then we have 3,167 FTE faculty. This would give us 84 citations per faculty, slightly better than Cambridge if citations of "University of Cambridge" publications are excluded and somewhat worse if they are included.
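The citations-per-faculty arithmetic in this post can be checked directly. A minimal sketch, using only the Scopus totals and faculty figures quoted above (Scopus totals were as of late 2007, so a fresh query would differ):

```python
# Reproduce the citations-per-faculty estimates from the figures quoted above.
# All inputs come from the post itself; none are fresh data.

# Cambridge: "Cambridge University" plus "University of Cambridge" affiliations
cambridge_citations = 306_603 + 26_710   # = 333,313
cambridge_fte = 3_765                    # FTE faculty figure from QS's website

# Harvard: the FTE estimate derived in the post from a 3,389 headcount
harvard_citations = 267_147
harvard_fte = 3_167

print(round(cambridge_citations / cambridge_fte))  # 89
print(round(harvard_citations / harvard_fte))      # 84
```

Both figures match the estimates above; QS's published scores (96 against 83) evidently rest on different faculty denominators, which is the point at issue.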


The problem is, though, that QS give Harvard a score of 96 for citations per faculty and Cambridge a score of 83. The only plausible way I can think of for Harvard to do so much better when it has fewer citations is that a smaller faculty figure was used to calculate the citations per faculty number for Harvard than was used to calculate the student faculty ratio. The Harvard website refers to "about [sic] 2,497 non-medical faculty" and in QS’s school profile of Harvard there is a reference to "more than 2,000 faculty". I suspect that one of these numbers was used to calculate the citations per faculty score, while the larger number was used to calculate the student faculty ratio. Had the former been used for both criteria, then Cambridge and Harvard would have been virtually equal for citations and Cambridge would have moved into the lead by virtue of a better international faculty score.

There may be some other explanation. If so, I would be glad to hear it.

If this is what happened, then it would be interesting to know whether there was simply another run-of-the-mill error, with that ubiquitous junior staff member using two different faculty figures to calculate the two components, or a cynical ploy to prevent Cambridge moving into the lead too early.


Thursday, September 25, 2014

How the Universities of Huddersfield, East London, Plymouth, Salford, Central Lancashire et cetera helped Cambridge overtake Harvard in the QS rankings

It is a cause of pride for the great and the good of British higher education that the country's universities do brilliantly in certain global rankings. Sometimes, though, there is puzzlement about how UK universities can do so well even though the performance of the national economy and the level of adult cognitive skills are so mediocre.

In the latest QS World University Rankings Cambridge and Imperial College London pulled off a spectacular feat when they moved ahead of Harvard into joint second place behind MIT, an achievement at first glance as remarkable as Leicester City beating Manchester United. Is this a tribute to the outstanding quality of teaching, inspired leadership or cutting edge research, or perhaps something else?

Neither Cambridge nor Imperial does very well in the research based rankings. Cambridge is 18th and Imperial 26th among higher education institutions in the latest Scimago rankings for output and 32nd and 33rd for normalised impact (citations per paper adjusted for field). Harvard is 1st and 4th for these indicators. In the CWTS Leiden Ranking, Cambridge is 22nd and Imperial 32nd for the mean normalised citation score, sometimes regarded as the flagship of these rankings, while Harvard is 6th.

It is true that Cambridge does much better on the Shanghai Academic Ranking of World Universities with fifth place overall, but that is in large measure due to an excellent score, 96.6, for alumni winning Nobel and Fields awards, some dating back several decades. For Highly Cited Researchers and publications in Nature and Science its performance is not nearly so good.

Looking at the THE World University Rankings, which make some attempt to measure factors other than research, Cambridge and Imperial come in 7th and 10th overall, which is much better than they do in the Leiden and Scimago rankings. However, it is very likely that the postgraduate teaching and research surveys made a significant contribution to this performance. Cambridge is 4th in the THE reputation rankings based on last year's data and Imperial is 13th.

Reputation is also a key to the success of Cambridge and Imperial in the QS world rankings. Take a look at the scores and positions of Harvard, Cambridge and Imperial in the rankings just released.

Harvard  gets 100 points (2nd place) for the academic survey, employer survey (3rd), and citations per faculty (3rd). It has 99.7 for faculty student ratio (29th), 98.1 for international faculty (53rd), and 83.8 for international students (117th). Harvard's big weakness is its relatively small percentage of international students.

Cambridge is in first place for the academic survey and 2nd in the employer survey, in both cases with a score of 100 and one place ahead of Harvard. The first secret of Cambridge's success is that it does much better on reputational measures than for bibliometric or other objective data. It was 18th for faculty student ratio, 73rd for international faculty, 50th for international students and 40th for citations per faculty.

So, Cambridge is ahead for faculty student ratio and international students and Harvard is ahead for international faculty and citations per faculty. Both get 100 for the two surveys.

Similarly, Imperial has 99.9 points for the academic survey (14th), 100 for the employer survey (7th), 99.8 for faculty student ratio (26th), 100 for international faculty (41st), 99.7 (20th) for international students and 96.2 (49th) for citations per faculty. It is behind Harvard for citations per faculty but just enough ahead for international students to squeeze past into joint second place.

The second secret is that QS's standardisation procedure combined with an expanding database means that the scores of the leading universities in the rankings are getting more and more squashed together at the top. QS turns its raw data into Z scores so that universities are measured according to their distance in standard deviations from the mean for all ranked universities. If the number of sub-elite universities in the rankings increases then the overall means for the indicators will fall and the scores of universities at the top end will rise as their distance in standard deviations from the mean increases.

Universities with scores of 98 and 99 will now start getting scores of 100. Universities with recorded scores of 100 will go on getting 100, although they might go up a few invisible decimal points.
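The squashing effect can be illustrated with a toy calculation. The scaling below (a university's Z score as a fraction of the top Z score, times 100) is only a stand-in for QS's undisclosed procedure, and the raw scores are invented, but the mechanism is the same: adding weak universities drags the mean down and pushes the runner-up's standardised score toward 100.

```python
import statistics

def scaled_score(raw_scores, score):
    """Stand-in for QS-style standardisation: a university's Z score as a
    fraction of the top Z score, scaled to 100. QS's exact procedure is not
    public; this only demonstrates the mechanism. The standard deviation
    cancels when one Z score is divided by another, so only the distance
    from the mean matters."""
    mean = statistics.mean(raw_scores)
    top = max(raw_scores)
    return 100 * (score - mean) / (top - mean)

# Invented raw indicator scores: two elite universities plus the rest
elite = [95, 90]
pool_before = elite + [50] * 600            # a pool of roughly 2008's size
pool_after = pool_before + [20] * 200       # expanded with weaker entrants

runner_up_before = scaled_score(pool_before, 90)
runner_up_after = scaled_score(pool_after, 90)

# The runner-up's standardised score rises even though nothing about the
# runner-up itself changed.
print(round(runner_up_before, 1), round(runner_up_after, 1))
```

With these invented numbers the runner-up climbs by more than a point purely because 200 low scorers joined the pool.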

In 2008, QS ranked 617 universities. In that year, nine universities had a score of 100 for the academic survey, four for the employer survey, nine for faculty student ratio, six for international faculty, six for international students and seven for citations per faculty.

By 2014 QS was ranking over 830 universities (I assume that those at the end of the rankings marked "NA" are there because they got votes in the surveys but are not ranked because they fail to meet the criteria for inclusion). For each indicator the number of universities getting a score of 100 increased. In 2014 there were 13 universities with a score of 100 for the academic survey, 14 for the employer survey, 16 for faculty student ratio, 41 for international faculty, 15 for international students and 10 for citations per faculty.

In 2008 Harvard got the same score as Cambridge for the academic and employer surveys. It was 0.3 (0.06 weighted) behind for faculty student ratio, 0.6 (0.03 weighted) behind for international faculty, and 14.1 (0.705 weighted) behind for international students. It was, however, 11.5 points (2.3 weighted) ahead for citations per faculty. Harvard was therefore first and Cambridge third.

By 2014 Cambridge had fallen slightly behind Harvard for international faculty. It was slightly ahead for faculty student ratio. Scores for the surveys remained the same, 100 for both places. Harvard reduced the gap for international students slightly.

What made the difference in 2014, and put Cambridge ahead of Harvard, was citations per faculty. In 2008 Harvard, in fifth place with a score of 100, was 11.5 points (2.3 weighted) ahead of Cambridge on this indicator. In 2014 Cambridge had improved a bit, rising to 40th from 49th, but now got 97.9 points, reducing the difference with Harvard to 2.1 points (0.42 weighted). That was just enough to let Cambridge overtake Harvard.
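The weighted figures quoted in these posts follow from the QS indicator weights of that era (academic survey 40%, employer survey 10%, faculty student ratio 20%, citations per faculty 20%, international faculty 5%, international students 5%). A quick check of the arithmetic:

```python
# QS indicator weights, 2008-2014 era
CITATIONS = 0.20
INTL_STUDENTS = 0.05
FACULTY_STUDENT = 0.20

# Harvard's 2008 citations lead over Cambridge: 11.5 raw points
print(round(11.5 * CITATIONS, 2))       # 2.3 weighted points
# The same lead in 2014, after score compression: 2.1 raw points
print(round(2.1 * CITATIONS, 2))        # 0.42 weighted points
# Cambridge's 2008 international-students lead: 14.1 raw points
print(round(14.1 * INTL_STUDENTS, 3))   # 0.705 weighted points
# Cambridge's 2008 faculty-student lead: 0.3 raw points
print(round(0.3 * FACULTY_STUDENT, 2))  # 0.06 weighted points
```

Each product matches the weighted gap quoted in the text, so the raw-to-weighted conversions are internally consistent.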

Cambridge's rise between 2008 and 2014 was thus largely due to the increasing number of ranked universities, which lowered the mean for each indicator, which in turn raised the Z scores at the top, and so reduced the effect of Cambridge's comparatively low citations per faculty score.

The same thing happened to Imperial. It did a bit better for citations, rising from 58th to 49th place, and this brought a rise in points from 83.10 to 96.20, again allowing it to creep past Harvard.

Cambridge and Harvard should be grateful to those universities filling up the 701+ category at the bottom of the QS rankings. They are the invisible trampoline that propelled "Impbridge" into second place, just behind MIT.

QS should think carefully about adding more universities to their rankings. Another couple of hundred and there will be a dozen universities at the top getting 100 for everything.









Thursday, June 16, 2011

The QS Arts and Humanities Rankings

See here for the complete rankings.

Here are the top five for each indicator (academic survey, employer survey, citations per paper) of the QS subject rankings.

There is nothing surprising about the leaders in the two surveys. But the citations indicator is another matter. Perhaps QS has followed Times Higher in uncovering "clear pockets of excellence". Would any specialists out there like to comment on Newcastle University (the English one, not the Australian) and Durham as joint first for history -- something to do with proximity to Hadrian's Wall? What about Brown for Philosophy, Stellenbosch for Geography and Area Studies and Padua for Linguistics?

English Language and Literature
Academic survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  UC Berkeley
5.  Yale

Employer Survey
1.  Oxford
2.  Cambridge
3.  Harvard
4.  MIT
5.  UC Los Angeles

No ranking for citations

Modern Languages
Academic Survey
1.  Harvard
2.  UC Berkeley
3.  Oxford
4.  Cambridge
5.  Cornell

Employer Survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  MIT
5.  Stanford

No rankings for citations

History
Academic Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  Yale
5.  UC Berkeley

Employer Survey
1. Oxford
2.  Harvard
3.  Cambridge
4.  University of Pennsylvania
5. Yale

Citations per Paper
1=  Newcastle (UK)
1=  Durham
3.   Liverpool
4.   George Washington
5.   University of Washington

Philosophy
Academic Survey
1.  Oxford
2.  Harvard
3.  Cambridge
4.  UC Berkeley
5.  Princeton

Employer Survey
1.  Cambridge
2.  Harvard
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Brown
2.  Melbourne
3.  MIT
4=  Rutgers
4=  Zurich


Geography and Area Studies
Academic survey
1.  UC Berkeley
2.  Cambridge
3.  Oxford
4.  Harvard
5.  Tokyo

Employer Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Stellenbosch
2. Lancaster
3.  Durham
4.  Queen Mary London
5.  University of Kansas


Linguistics
Academic Survey
1.  Cambridge
2.  Oxford
3.  Harvard
4.  UC Berkeley
5.  Stanford

Employer Survey
1.  Harvard
2.  Oxford
3.  MIT
4.  UC Berkeley
5.  Melbourne

Citations per Paper
1.  Padua
2.  Boston University
3.  York University (UK)
4.  Princeton
5.  Harvard

Sunday, August 14, 2011

Press release from Shanghai

Here is the press release from Shanghai Jiao Tong University giving more details about this year's rankings.

Monday, August 15, 2011
Shanghai, People's Republic of China
The Center for World-Class Universities of Shanghai Jiao Tong University released today the 2011 Academic Ranking of World Universities (ARWU), marking its 9th consecutive year of measuring the performance of top universities worldwide.
Harvard University tops the 2011 list; other Top 10 universities are: Stanford, MIT, Berkeley, Cambridge, Caltech, Princeton, Columbia, Chicago and Oxford. In Continental Europe, ETH Zurich (23rd) in Switzerland takes first place, followed by Paris-Sud (40th) and Pierre and Marie Curie (41st) in France. The best ranked universities in Asia are University of Tokyo (21st) and Kyoto University (24th) in Japan.
Three universities are ranked among Top 100 for the first time in the history of ARWU: University of Geneva (73rd), University of Queensland (88th) and University of Frankfurt (100th). As a result, the number of Top 100 universities in Switzerland, Australia and Germany increases to 4, 4 and 6 respectively.
Ten universities enter the Top 500 for the first time; among them, University of Malaya in Malaysia and University of Zagreb in Croatia enable their home countries to be represented, together with 40 other countries, in the 2011 ARWU list.
Progress of universities in Middle East countries is remarkable. King Saud University in Saudi Arabia first appears in Top 300; King Fahd University of Petroleum & Minerals in Saudi Arabia, Istanbul University in Turkey and University of Teheran in Iran move up in Top 400 for the first time; Cairo University in Egypt is back to Top 500 after five years of staggering outside.
The number of Chinese universities in Top 500 increases to 35 in 2011, with National Taiwan University, Chinese University of Hong Kong, and Tsinghua University ranked among Top 200.
The Center for World-Class Universities of Shanghai Jiao Tong University also released the 2011 Academic Ranking of World Universities by Broad Subject Fields (ARWU-FIELD) and 2011 Academic Ranking of World Universities by Subject Field (ARWU-SUBJECT). Top 100 universities in five broad subject fields and in five selected subject fields are listed, where the best five universities are:
Natural Sciences and Mathematics – Harvard, Berkeley, Princeton, Caltech and Cambridge
Engineering/Technology and Computer Sciences – MIT, Stanford, Berkeley, UIUC and Georgia Tech
Life and Agriculture Sciences – Harvard, MIT, UC San Francisco, Cambridge and Washington (Seattle)
Clinical Medicine and Pharmacy – Harvard, UC San Francisco, Washington (Seattle), Johns Hopkins and Columbia
Social Sciences – Harvard, Chicago, MIT, Berkeley and Columbia
Mathematics – Princeton, Harvard, Berkeley, Stanford and Cambridge
Physics – MIT, Harvard, Caltech, Princeton and Berkeley
Chemistry – Harvard, Berkeley, Stanford, Cambridge and ETH Zurich
Computer Science – Stanford, MIT, Berkeley, Princeton and Harvard
Economics/Business – Harvard, Chicago, MIT, Berkeley and Columbia
The complete lists and detailed methodologies can be found at the Academic Ranking of World Universities website at http://www.ShanghaiRanking.com/.
Academic Ranking of World Universities (ARWU): Starting from 2003, ARWU has been presenting the world top 500 universities annually based on a set of objective indicators and third-party data. ARWU has been recognized as the precursor of global university rankings and the most trustworthy list. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution. More than 1000 universities are actually ranked by ARWU every year and the best 500 are published.
Center for World-Class Universities of Shanghai Jiao Tong University (CWCU): CWCU has been focusing on the study of world-class universities for many years, published the first Chinese-language book titled world-class universities and co-published the first English book titled world-class universities with European Centre for Higher Education of UNESCO. CWCU initiated the "International Conference on World-Class Universities" in 2005 and organizes the conference every second year, which attracts a large number of participants from all major countries. CWCU endeavors to build databases of major research universities in the world and clearinghouse of literature on world-class universities, and provide consultation for governments and universities.
Contact: Dr. Ying CHENG at ShanghaiRanking@gmail.com

Tuesday, January 25, 2011

Cambridge and Harvard

After the break with THE, QS decided to continue with the old methodology of the 2004-2009 rankings. At least, that is what they said. It was therefore surprising to see that, according to data provided by QS, there were in fact a number of noticeable rises and falls between 2009 and 2010, although nothing like as much as in previous years.


For example the University of Munich fell from 66th place to 98th place, the Free University of Berlin from 70th to 94th and Stockholm University from 168th to 215th while University College Dublin rose from 114th to 89th and Wurzburg from 309th to 215th.

But perhaps the most remarkable news was that Cambridge replaced Harvard as the world's best university. In every other ranking Harvard is well ahead.

So how did it happen? According to Martin Ince, “Harvard has taken more students since the last rankings were compiled without an equivalent increase in the number of academics.”

In other words there should have been a lower faculty student ratio and therefore a lower score for this indicator. This in fact happened. Harvard’s score went from 98 to 97.

Ince also says that there was an "improvement in staffing levels" at Cambridge, presumably meaning that there was an increase in the number of faculty relative to the number of students. Between 2009 and 2010 Cambridge’s score for the student faculty ratio remained the same at 100, which is consistent with Ince’s claim.

In addition to this, there was a "significant growth in the number of citations per faculty member" for Cambridge. It is not impossible that the number of citations racked up by Cambridge has risen relative to Harvard, but the QS indicator counts citations over a five-year period, so even a substantial increase in publications or citations would take a few years to have an equivalent effect on this indicator. Also note that this indicator is citations per faculty, and it appears that the number of faculty at Cambridge has gone up relative to Harvard. So we would expect any increase in citations to be cancelled out by a similar increase in faculty.

It looks a little odd then that for this indicator the Cambridge score rose from 89 to 93, four points, which is worth 0.8 in the weighted total score. That, by the way, was the difference between Harvard and Cambridge in 2009.

The oddity is compounded when we look at other high-ranking universities. Between 2009 and 2010 Leiden's score for citations per faculty rose from 97 to 99, Emory's from 90 to 95, Oxford's from 80 to 84, and Florida's from 70 to 75.

It would at first sight appear plausible that if Harvard, the top scorer in both years, did worse on this indicator, then everybody or nearly everybody else would do better. But if we look at universities further down the table, we find the opposite. Between 2009 and 2010 for this indicator Bochum fell from 43 to 34, Ghent from 43 to 37, Belfast from 44 to 35, and so on.

Could it be that there was some subtle and unannounced change in the method by which the raw scores were transformed into indicator scores? Is it just a coincidence that the change was sufficient to erase the difference between Harvard and Cambridge?

http://www.wiziq.com/tutorial/90743-QS-World-University_Rankings-top-500

Sunday, October 09, 2011

Caltech in First Place

The big news of the 2011 THE - TR rankings is that Caltech has replaced Harvard as the world's top university. So how exactly did they do it?

According to the Times Higher iPad apps for this year and last (easily downloadable from the rankings page), Harvard's total score fell from 96.1 to 93.9 and Caltech's from 96.0 to 94.8, turning a 0.1 Harvard lead into one of 0.9 for Caltech.

Harvard continued to do better than Caltech in two indicators, with 95.8 for teaching and 67.5 for international orientation compared to 95.7 and 56.0 for Caltech.

Caltech is much better than Harvard in industry income - innovation but that indicator has a weighting of only 2.5%.

Harvard's slight lead in the research indicator has turned into a slight lead of 0.8 for Caltech.

Caltech is still ahead for citations but Harvard caught up a bit, narrowing the lead to 0.1.

So, it seems that what made the difference was the research indicator. It seems unlikely that Caltech could overcome Harvard's massive lead in reputation for research and postgraduate teaching: last year it was 100 compared with 23.5. That leaves us with research income per faculty.
 
According to Phil Baty :

"Harvard reported funding increases that are similar in proportion to those of many other universities, whereas Caltech reported a steep rise (16 per cent) in research funding and an increase in total institutional income."

This seems generally compatible with Caltech's 2008-2009 financial statement according to which:

Before accounting for investment losses, total unrestricted revenues increased 6.7% including JPL, and 14.0% excluding JPL

and

Research awards in FY 2009 reached an all-time high of $357 million, including $29 million of funds secured from the federal stimulus package. Awards from federal sponsors increased by 34.4%, while awards from nonfederal sponsors increased by 20.7%.  We also had a good year in terms of private giving, as donors continue to recognize the importance of the research and educational efforts of our outstanding faculty and students.

It seems that research income is going to be the tie-breaker at the top of the THE - TR rankings.  This might not be such a good thing. Income is an input. It is not a product, although universities everywhere apparently think so. There are negative backwash effects coming if academics devote their energies to securing grants rather than actually doing research.

Sunday, November 10, 2019

When will Tsinghua Overtake Harvard?

One of the most interesting trends in higher education over the last few years is the rise of China and the relative decline of the USA.

Winston Churchill said that the empires of the future would be empires of the mind. If that is so, then this century will very likely be the age of Chinese hegemony. Chinese science is advancing faster than that of the USA on all or nearly all fronts, unless we count things like critical race theory or queer studies.

This is something that should show up in the global rankings if we track them over at least a few years. So, here is a comparison of the top two universities in the two countries according to indicators of research output and research quality over a decade.

Unfortunately, most international rankings are not very helpful in this respect. Few of the current ones provide data for a decade or more. QS and THE have seen frequent changes in methodology, and THE's citation indicator, although charmingly amusing, is not useful unless you think that Aswan University, Anglia Ruskin University and the University of Peradeniya are world beaters for research impact. Two helpful rankings here are the Shanghai Academic Ranking of World Universities (ARWU) and Leiden Ranking.

Let's compare the performance of Tsinghua University and Harvard in the Shanghai Ranking's indicator of research output, papers over a one-year period, excluding arts and humanities. The published scores are derived from the square roots of the raw data, with the top scorer getting a score of 100.

In 2009 Harvard's score was 100 while Tsinghua's was 55.8. In 2019 it was 100 for Harvard and 79.5 for Tsinghua, so the gap is closing by about 2.37 points every year. At that rate it would take about nine years for Tsinghua to catch up, so look out for 2028.

Of course, this is quantity not quality, so take a look at another indicator, Highly Cited Researchers. This is a moderately gamable metric and I suspect that Shanghai might have to abandon it one day, but it captures the willingness and ability of universities to sponsor research of a high quality. In 2009 Tsinghua's score was zero compared to Harvard's 100. In 2019 it was 37.4. If everything continues at the same rate Tsinghua will overtake Harvard in another 17 years.

Looking at the default indicator in Leiden Ranking, total publications, Tsinghua's output was 35% of Harvard's in 2007-10 and 56% in 2014-17. Working from that, Tsinghua would achieve parity in 2029-33, in the rankings published in 2035.

Looking at a measure of research quality, publications among the top 10% most cited, Tsinghua was 15% of Harvard in 2007-10 and 34% in 2014-17. From that, Tsinghua should reach parity in 2038-42, in the rankings published in 2044, assuming Leiden is still following its current methodology.

So it looks like Tsinghua will reach parity in research output in a decade or a decade and a half, and in high-quality research in a decade and a half to two and a half decades.
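The projections above are simple linear extrapolations, which can be reproduced from the quoted scores:

```python
# Linear extrapolation of the ARWU scores quoted above: Tsinghua's
# publications score rose from 55.8 (2009) to 79.5 (2019) while
# Harvard stayed at 100, a gap closing by about 2.37 points a year.
def parity_year(start_year, start_score, end_year, end_score, target=100.0):
    """Year in which a linearly extrapolated score reaches the target."""
    rate = (end_score - start_score) / (end_year - start_year)
    return end_year + (target - end_score) / rate

# Publications indicator: parity around 2028.
print(round(parity_year(2009, 55.8, 2019, 79.5)))  # 2028

# Highly Cited Researchers: zero in 2009, 37.4 in 2019.
print(round(parity_year(2009, 0.0, 2019, 37.4)))   # 2036, about 17 more years
```

This is, of course, only as good as the assumption that the last decade's rate continues unchanged.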


Tuesday, July 05, 2011

QS Subject Rankings for the Social Sciences

QS have released their subject rankings for the social sciences based on data gathered during last year's rankings.

The overall rankings are not surprising. Here are top three in each subject.

Sociology
1.  Harvard
2.  UC Berkeley
3.  Oxford

Statistics and Operational Research
1.  Stanford
2.  Harvard
3.  UC Berkeley

Politics and International Studies
1.  Harvard
2.  Oxford
3.  Cambridge

Law
1.  Harvard
2.  Oxford
3.  Cambridge

Economics and Econometrics
1.  Harvard
2.  MIT
3. Stanford

Accounting and Finance
1.  Harvard
2.  Oxford
3.  MIT

The top three in the citations per paper indicator are, in most cases, rather different. Are these pockets of excellence or something else?

Sociology
1=  Boston College
1=  Munich
3.   Florida State University

Statistics and Operational Research
1.  Aarhus
2.  Helsinki
3.  Erasmus University Rotterdam

Politics and International Studies
1.  Yale
2.  Oslo
3.  Rutgers

Law
1.  Victoria University of Wellington
2.  Koln
3.  Munster

Economics and Econometrics
1.  Dartmouth
2.  Harvard
3.  Princeton

Accounting and Finance
1.  University of Pennsylvania
2=  Harvard
2=  North Carolina at Chapel Hill

Wednesday, May 18, 2011

The QS Life Sciences Ranking Continued

Looking at the scores for the three indicators, academic survey, employer survey and citations per paper, we find the situation is similar to that of the engineering rankings released last month. There is a reasonably high correlation between the scores for the two surveys:

Medicine                     .720
Biological Sciences      .747
Psychology                  .570

The correlations between the score for citations per paper and the academic survey are low but still significant:
Medicine                          .290
Biological Sciences           .177
Psychology                       .217

The correlations between the citations indicator and the employer survey are low or very low and insignificant:
Medicine                               .129
Biological Sciences                .015
Psychology                           -.027


Looking at the top five universities for each indicator, there are no surprises as far as the surveys are concerned, but some of the universities in the top five for citations do cause some eyebrow-raising. Arizona State University? University of Cincinnati? Tokyo Metropolitan University? Perhaps these are hitherto unnoticed pockets of excellence of the Alexandrian kind?

Top Five in Medicine

Academic Survey

1.    Harvard
2.    Cambridge
3.    Oxford
4.    Stanford
5.    Yale

Employer Survey

1.     Harvard
2.     Cambridge
3.     Oxford
4.     MIT
5.     Stanford

Citations per Paper 

1.    MIT
2.    Rockefeller University
3.    Caltech
4.    The University of Texas M. D. Anderson Cancer Center
5.     Harvard


Top Five in Biological Sciences

Academic Survey

1.    Cambridge
2.    Harvard
3.    UC Berkeley
4.    Oxford
5.    MIT

Employer Survey


1.  Harvard
2.  Cambridge
3.  MIT
4.  Oxford
5.  Stanford

Citations per Paper

1.  Arizona State University
2.   Tokyo Metropolitan University
3.   MIT
4.   Rockefeller University
5.   Harvard

Top Five in Psychology

Academic Survey

1.    Harvard
2.   Stanford
3.    UC Berkeley
4.    Cambridge
5.    Oxford

Employer Survey 

1.     Cambridge
2.     Harvard
3.     Oxford
4.     Stanford
5.     UC Berkeley

Citations per Paper

1.     UC Irvine
2.     Emory
3.     University of Cincinnati
4.     Princeton
5.     Dartmouth College

Tuesday, September 06, 2011

The Best University in the World
Update 8/9/2011 -- some comments added

For many people the most interesting thing about the QS rankings is the battle for the top place. The Shanghai rankings put Harvard in first place year after year and no doubt will do so for the next few decades. QS, when it was in partnership with Times Higher Education, also routinely put Harvard first. Still, Cambridge, Oxford and two London colleges did quite well, mainly because they got high scores for international faculty and students and for the academic survey (not surprising since a disproportionate number of responses came from the UK, Australia and New Zealand), but not well enough to get over their not very distinguished research record.

Last year, however, Cambridge squeezed past Harvard. This was not because of the academic and employer surveys: those remained at 100 for both places. What happened was that between 2009 and 2010 Cambridge's score for citations per faculty increased from 89 to 93. This would be a fine achievement if it represented a real improvement. Unfortunately, almost every university with a score above 60 for this indicator in 2009 went up by a similar margin in 2010, while universities with scores below 50 slumped. Evidently, there was a new method of converting raw scores. Perhaps a mathematician out there can help.

And this year?

Cambridge and Harvard are both at 100 for the academic and employer surveys just like last year. (Note that although Harvard does better than Cambridge in both surveys they get the same reported score of 100).


For the faculty student ratio Harvard narrowed the gap a little from 3 to 2.5 points. In citations per faculty Cambridge slipped a bit by 0.3 points. However, Cambridge pulled further ahead on international students and faculty.

Basically, from 2004 to 2009 Harvard reigned supreme because its obvious superiority in research was more than enough to offset the advantages Cambridge enjoyed with regard to internationalisation (small country and policies favouring international students), faculty student ratio (counting non-teaching research staff) and the academic survey (disproportionate responses from the UK and Commonwealth). But this year and last the change in the method of converting the raw scores for citations per faculty artificially boosted Cambridge's overall scores.

So, is Cambridge really the world's top university?

Wednesday, September 08, 2010

Cambridge Beats Harvard -- Sort of

The big news from the QS World University Rankings today is that Cambridge is finally top after trailing Harvard for six years.

This seems a little odd since Cambridge is way behind Harvard, and a few other places, on all the indicators in the Shanghai rankings. So what happened? Looking at the indicator scores we find that on the "Academic Peer Review" -- more accurately called an Academic Reputation Index elsewhere on the site -- Cambridge is first and Harvard second. For the Employer Review Cambridge is third and Harvard first, reversing their places last year. For citations per faculty Harvard was third and Cambridge 36th, behind Tufts, Emory and UC Santa Cruz among others. For student faculty ratio, Cambridge was 18th and Harvard 40th. At the time of writing data was not available for International Faculty and Students.

It seems that the main factor in Cambridge's success was the academic survey. QS indicates the sources of the survey.
  • 1,648 previous respondents who returned. If QS have continued the practice of previous years, they also counted respondents from 2009 and 2008 even if they did not submit a form.
  • 180,000 out of 300,000 persons on the mailing list of World Scientific, a Singapore-based publishing company with links to Imperial College London. World Scientific, by the way, claim to have 400,000 subscribers.
  • 48,125 records from Mardev-DM2
  • 2,000 academics who signed up at the QS site
  • Lists provided by institutions. In 2010 160 universities provided more than 40,000 names.

I will let readers decide how representative or accurate such a survey can be.

Incidentally, QS should be given credit for the detailed description of the methodology of this criterion.

Thursday, February 06, 2014

The Best Universities for Research

It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings start in August. We have had young university rankings, best student cities, most international universities, and BRICS rankings.

Something is missing, though: a ranking of top universities for research. So, to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.

First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.

1.   Harvard
2.   Tokyo
3.   Toronto
4.   Tsinghua
5.   Sao Paulo
6.   Michigan Ann Arbor
7.   Johns Hopkins
8.   UCLA
9.   Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia

Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.

1.   MIT
2.   Harvard
3.   University of California San Francisco
4=  Stanford
4=  Princeton
6.   Duke
7.   Rice
8.   Chicago
9=  Columbia
9=  University of California Berkeley
9=  University of California Santa Cruz
12.  University Of California Santa Barbara
13.  Boston University
14= Johns Hopkins
14= University of Pennsylvania
16.  University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20.  Oxford

The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.

1.   Weizmann Institute of Science
2.   Caltech
3.   Rockefeller University
4.   Harvard
5.   Stanford
6.   Gwangju Institute of Science and Technology
7.   UCLA
8.   University of California San Francisco
9.   Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell

The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, providing that total publications do not rise too much.

1=  MIT
1=  Tokyo Metropolitan University
3=  University of California Santa Cruz
3=  Rice
5.   Caltech
6.   Princeton
7.   University of California Santa Barbara
8.   University of California Berkeley
9=  Harvard
9=  Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14.  University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19.  Washington University of St Louis
20.  Boston College

The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.

1.   Harvard
2.   Stanford
3.   MIT
4.   University of California Berkeley
5.   Princeton
6.   Michigan Ann Arbor
7.   University of California San Diego
8.   Yale
9.   University of Pennsylvania
10.   UCLA
11=  Caltech
11=  Columbia
13.   University of Washington
14.   Cornell
15.   Cambridge
16.   University of California San Francisco
17.   Chicago
18.   University of Wisconsin Madison
19.   University of Minnesota Twin Cities
20.   Oxford


Finally, the MNCS indicator from the Leiden Ranking, which is the number of field normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper the citation of which is required for intellectual property reasons.

1.    MIT
2.    Gottingen
3.    Princeton
4.    Caltech
5.    Stanford
6.    Rice
7.    University of California Santa Barbara
8.    University of California Berkeley
9.    Harvard
10.   University of California Santa Cruz
11.   EPF Lausanne
12.   Yale
13.   University of California San Francisco
14.   Chicago
15.   University of California San Diego
16.   Northwestern
17.   University of Colorado Boulder
18.   Columbia
19.   University of Texas Austin
20.   UCLA


Wednesday, December 12, 2007

Student Faculty Ratios

Something especially striking about the THES~QS rankings this year is that British universities have done spectacularly well overall while getting, comparatively speaking, miserable scores on the citations section. We have to remember that this component does not measure the absolute number of citations but the number per faculty. It is then worth investigating whether the high scores for student faculty ratios are the result of inflated faculty numbers which have also led to reduced scores for citations per faculty. First, I want to look at the faculty data for the top British and American universities.

Cambridge

Looking at the QS website we find that they claim that Cambridge has a total of 3,765 Full Time Equivalent (FTE) faculty. The data was entered on 23/8/07 by Saad Shabbir, presumably an employee of QS.

Going to the Cambridge site we find that as of July, 2005, Cambridge had 1,558 academic staff, 1,167 academic-related staff (presumably in computers, administration, libraries and so on and probably also research) and 2,497 contract research staff. Adding the first and third categories and leaving out the second, gives us 4,055, close to QS’s figure for total faculty.

It seems reasonable then to conclude that QS added academic staff to research contract staff and made an adjustment to arrive at a Full Time Equivalent (FTE) number to come up with the total faculty. No doubt they got more up to date information than is available on the university website.

With 18,309 FTE students this gives us a student faculty ratio of 4.9. This is much better than the figures from third-party sources: the Higher Education Statistics Agency (HESA) provides a figure of 11.9.

It looks like QS have counted both teaching staff and contract research staff who do little or no teaching as faculty members.
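A quick sketch using the staff and student figures quoted above shows how much difference counting contract research staff makes to the ratio (QS's own FTE adjustment gives a slightly different faculty total, so the first figure is approximate):

```python
# Cambridge figures quoted above: 18,309 FTE students,
# 1,558 academic staff, 2,497 contract research staff.
students = 18309
academic = 1558
research = 2497

# Counting research staff as faculty, roughly as QS appears to do:
print(round(students / (academic + research), 1))  # 4.5

# Teaching staff only, much closer to HESA's figure of 11.9:
print(round(students / academic, 1))  # 11.8
```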

Oxford

According to QS Oxford has 3,942 FTE faculty (data entered by Saad Shabbir 21/08/07) and 18,667 FTE students, a ratio of 4.7 students per faculty.

According to Oxford there were (July 2006) 1,407 academic staff, 612 in administration and computing, 169 library and museum staff, 753 in university funded research, 2,138 in externally funded research and 15 in self-funded research (all FTE). All this adds up to 4,094, very close to QS’s figure. It seems that for Oxford, QS has included research and other staff in the total faculty.

According to HESA Oxford has 13 students per faculty.


Imperial College London

The QS site indicates that Imperial has 2,963 FTE faculty and 12,025 FTE students (data entered by Saad Shabbir 21/08/07), a ratio of 4.1.

The Imperial site indicates 1,114 academic staff and 1,856 research staff (FTE 2006-7), a total of 2,970 academic and research staff combined. It would seem that QS have again counted research staff as faculty. This site refers to a 12,509 student load and a student staff ratio of 11.2. The HESA ratio is 9.4.

Harvard

According to QS, the Harvard faculty headcount is 3,369 (data entered by Baerbel Eckelmann 8/07/07). There were 29,000 students by headcount (FTE 16,520). The headcount student faculty ratio is 8.6.

According to the United States News and World Report (USNWR), 8% of Harvard’s faculty are part-time. If part-time means doing a quarter of a full-time teaching load, this would put Harvard’s FTE faculty at about 3,167. The FTE student faculty ratio would then be about 5.2.

The Harvard site, however, refers to a much smaller number of faculty, 2,497 non-medical faculty, and to 20,042 students, making a ratio of 8.0. The USNWR indicates a ratio of 7 for Harvard (2005).


Something strange about QS’s data is that it refers to a headcount of 13,078 undergraduates but only 3,593 FTE. This is something that definitely needs explaining.


Yale

According to QS, the number of faculty by headcount is 3,248. The number of students is 11,851 by headcount and 10,845 FTE. The headcount student faculty ratio is then 3.6.

According to the Yale site, there are 3,384 faculty and 11,358 students, a ratio of 3.4. (All figures from the 2006-7 academic year.)

For the fall of 2006 the faculty headcount included:

Tenured faculty 906

Term 966

Nonladder 903

Research 609

The USNWR ratio for Yale is 6.

Princeton

According to QS, the faculty headcount was 1,263 (entered by Baerbel Eckelmann 09/07/07). The number of students was 6,708 by headcount and 6,795 FTE. The headcount ratio is 5.3

According to the Princeton site, there are more than 850 FTE faculty and 7,055 students, a ratio of 8.3. USNWR has a ratio of 5.

Conclusion

It seems that QS’s policy is to include any sort of research staff, whether or not they do any teaching, in the category of faculty. In some cases, other professional non-teaching staff are also included. This produces student faculty ratios that are noticeably better than those that can be calculated from, and sometimes specifically stated in, the universities’ web sites or that are provided by other sources. It looks as though British universities have benefited from this more than their American counterparts.

This means, very ironically, that this measure, which is supposed to be a proxy for teaching quality, is to a large extent a reflection of a university’s commitment to research since the employment of large numbers of researchers, or even librarians and computer programmers, would lead to an improvement in this ratio.


It also looks as though leading British universities are favoured disproportionately by this procedure, although a definite conclusion would have to await more extensive analysis.


I think that we can put forward a working hypothesis that British universities have been ascribed inflated faculty numbers and that this contributes to high scores for teaching quality as measured by student faculty ratio and to low scores for research as measured by citations per faculty.

Sunday, September 09, 2012

Will There be a New Number One?

One reason why QS and Times Higher Education get more publicity than the Shanghai ARWU, HEEACT and other rankings is that they periodically produce interesting surprises. Last year Caltech replaced Harvard as number one in the THE rankings and Tokyo overtook Hong Kong as the best Asian university. Two years ago Cambridge pushed Harvard aside at the top of the QS rankings.

Will there be another change this year?

There is an intriguing interview with Les Ebdon, the UK government's "university access tsar", in the Daily Telegraph. Ebdon claims that leading British universities are in danger of losing their world-class status unless they start admitting more students from state schools who may be somewhat less academically qualified. Perhaps he knows something.

So if Cambridge slips and is replaced by Harvard, MIT or Yale as QS number one (if it is Oxford or Imperial, QS will lose all credibility), we can expect comments that Cambridge should start listening to him before it's too late.

I suspect that if there is a new number one it might have something to do with the QS employer review. Since this is a sign up survey and since the numbers are quite small it would not take many additional responses to push Harvard or MIT into first place.

With regard to THE, the problem there is that normalising everything by country, year and/or field is a potential source of instability. If there is a vigorous debate, with lots of citations, about an obscure article by a Harvard researcher in a little-cited field, it could dramatically boost the score on the citations indicator.

Getting a good score in the THE rankings also depends on what a school is being compared to. Last year, Hong Kong universities slumped because they were taken out of China (with low average scores) and classified as a separate country (with high average scores), so that their relative scores were lower. If they are put back in China they will go up this year and there will be a new number one Asian university.

So anybody want to bet on Harvard making a come back this year? Or Hong Kong regaining the top Asian spot from Tokyo in the THE rankings?

Thursday, June 19, 2014

The New Highly Cited Researchers List

Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one which has 7,000 plus.

The new list got off to a bad start in 2013 because the preliminary version was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in the 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking at the Harvard researchers for a few minutes, I noticed that David M Sabatini, primary affiliation MIT with secondary affiliations at the Broad Institute of Harvard and MIT, is listed for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliations with Broad Institute Harvard and MIT and secondary affiliations with MIT and Harvard, is listed three times for  Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt represents the reality of scientific research in which a single researcher might well excel in two or more closely related fields but if ARWU are just going to count the number of researchers in the new list there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field (21 in this case) and by year of publication. In other words, it is not the raw number of citations that matters but the number in relation to the world average for that field and year of publication.
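A minimal sketch of that normalisation, with invented world averages for illustration:

```python
# Field- and year-normalised impact: a paper's citations divided by
# the world average for its field and publication year. The averages
# here are invented for the sketch.
world_avg = {("Molecular Biology", 2005): 40.0,
             ("Mathematics", 2005): 6.0}

def normalised_impact(citations, field, year):
    """Citations relative to the world average for that field and year."""
    return citations / world_avg[(field, year)]

# 30 citations is below par in molecular biology...
print(normalised_impact(30, "Molecular Biology", 2005))  # 0.75
# ...but far above par in mathematics.
print(normalised_impact(30, "Mathematics", 2005))        # 5.0
```

This is why raw citation counts and normalised indicators can rank the same institutions very differently.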

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of  secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliation to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah has 150 secondary affiliations. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence  that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations based indicator or only one database.


Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered by subject. The methods are different and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering: Chinese universities lead in fifteen subjects, while Harvard, MIT, and Northwestern University lead in seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.