
Monday, May 11, 2015

The Geography of Excellence: the Importance of Weighting


So finally, the 2015 QS subject rankings were published. It seems that the first attempt was postponed when the original methodology produced implausible fluctuations, probably resulting from the volatility that is inevitable when there are a small number of data points -- citations and survey responses -- outside the top 50 for certain subjects.

QS have done some tweaking, some of it aimed at smoothing out the fluctuations in the responses to their academic and employer surveys.

These rankings look a bit different from the World University Rankings. Cambridge has the most top ten placings (31), followed by Oxford and Stanford (29 each), Harvard (28), Berkeley (26) and MIT (16).

But in the world rankings MIT is in first place, Cambridge second, Imperial College London third, Harvard fourth and Oxford and University College London joint fifth.

The subject rankings use two indicators from the world rankings, the academic survey and the employer survey, but not internationalisation, student faculty ratio or citations per faculty. They add two indicators, citations per paper and the h-index.

The result is that the London colleges do less well in the subject rankings since they do not benefit from their large numbers of international students and faculty. Caltech, Princeton and Yale also do relatively badly, probably because the new rankings do not take account of their favourable faculty-student ratios.

The lesson of this is that if weighting is not everything, it is definitely very important.

Below is a list of universities ordered by the number of top five placings. There are signs of the Asian advance --  Peking, Hong Kong and the National University of Singapore -- but it is an East Asian advance.

Europe is there too but it is Cold Europe -- Switzerland, Netherlands and Sweden -- not the Mediterranean.


Rank   University   Country   Number of Top Five Places
1   Harvard   USA   26
2   Cambridge   UK   20
3   Oxford   UK   18
4   Stanford   USA   17
5=   MIT   USA   16
5=   UC Berkeley   USA   16
7   London School of Economics   UK   7
8=   University College London   UK   3
8=   ETH Zurich   Switzerland   3
10=   New York University   USA   2
10=   Yale   USA   2
10=   Delft University of Technology   Netherlands   2
10=   National University of Singapore   Singapore   2
10=   UC Los Angeles   USA   2
10=   UC Davis   USA   2
10=   Cornell   USA   2
10=   Wisconsin-Madison   USA   2
10=   Michigan   USA   2
10=   Imperial College London   UK   2
20=   Wageningen   Netherlands   1
20=   University of Southern California   USA   1
20=   Pratt Institute, New York   USA   1
20=   Rhode Island School of Design   USA   1
20=   Parsons: The New School for Design   USA   1
20=   Royal College of Art, London   UK   1
20=   Melbourne   Australia   1
20=   Texas-Austin   USA   1
20=   Sciences Po   France   1
20=   Princeton   USA   1
20=   Yale   USA   1
20=   Chicago   USA   1
20=   Manchester   UK   1
20=   University of Pennsylvania   USA   1
20=   Durham   UK   1
20=   INSEAD   France   1
20=   London Business School   UK   1
20=   Northwestern   USA   1
20=   Utrecht   Netherlands   1
20=   Guelph   Canada   1
20=   Royal Veterinary College, London   UK   1
20=   UC San Francisco   USA   1
20=   Johns Hopkins   USA   1
20=   KU Leuven   Belgium   1
20=   Gothenburg   Sweden   1
20=   Hong Kong   Hong Kong   1
20=   Karolinska Institute   Sweden   1
20=   Sussex   UK   1
20=   Carnegie Mellon University   USA   1
20=   Rutgers   USA   1
20=   Pittsburgh   USA   1
20=   Peking   China   1
20=   Purdue   USA   1
20=   Georgia Institute of Technology   USA   1
20=   Edinburgh   UK   1

Tuesday, October 07, 2014

The Times Higher Education World University Rankings





Publisher

Times Higher Education



Scope

Global. Data provided for 400 universities. Over 800 ranked.


Top Ten


Place   University
1   California Institute of Technology (Caltech)
2   Harvard University
3   Oxford University
4   Stanford University
5   Cambridge University
6   Massachusetts Institute of Technology (MIT)
7   Princeton University
8   University of California Berkeley
9=   Imperial College London
9=   Yale University



Countries with Universities in the Top Hundred


Country   Number of Universities
USA   45
UK   11
Germany   6
Netherlands   6
Australia   5
Canada   4
Switzerland   3
Sweden   3
South Korea   3
Japan   2
Singapore   2
Hong Kong   2
China   2
France   2
Belgium   2
Italy   1
Turkey   1



Top Ranked in Region


North America:   California Institute of Technology (Caltech)
Africa:   University of Cape Town
Europe:   Oxford University
Latin America:   Universidade de Sao Paulo
Asia:   University of Tokyo
Central and Eastern Europe:   Lomonosov Moscow State University
Arab World:   University of Marrakech Cadi Ayyad
Middle East:   Middle East Technical University
Oceania:   University of Melbourne



Noise Index

In the top 20, this year's THE world rankings are less volatile than the previous edition and this year's QS rankings. They are still slightly less stable than the Shanghai rankings.


Ranking   Average Place Change of Universities in the Top 20
THE World Rankings 2013-2014   0.70
THE World Rankings 2012-2013   1.20
QS World Rankings 2013-2014   1.45
ARWU 2013-2014   0.65
Webometrics 2013-2014   4.25
Center for World University Ranking (Jeddah) 2013-2014   0.90


Looking at the top 100 universities, the  THE rankings are more stable than last year. The average university in the top 100 in 2013 rose or fell 4.34 places. The QS rankings are now more stable than the THE or Shanghai rankings.

Ranking   Average Place Change of Universities in the Top 100
THE World Rankings 2013-2014   4.34
THE World Rankings 2012-2013   5.36
QS World Rankings 2013-2014   3.94
ARWU 2013-2014   4.92
Webometrics 2013-2014   12.08
Center for World University Ranking (Jeddah) 2013-2014   10.59


Note: universities falling out of the top 100 are treated as though they fell to 101st position.
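
As a rough illustration of how these figures can be computed, here is a minimal sketch (the function and data are hypothetical, not the blog's own code) that takes last year's and this year's rank lists and returns the average absolute change in position, treating universities that drop out of the top 100 as having fallen to 101st:

```python
def noise_index(previous, current, cutoff=100):
    """Average absolute place change for universities in last year's top `cutoff`.
    previous and current map university name -> rank (1-based)."""
    changes = []
    for name, old_rank in previous.items():
        if old_rank > cutoff:
            continue                                                # only last year's top `cutoff` count
        new_rank = min(current.get(name, cutoff + 1), cutoff + 1)   # drop-outs treated as 101st
        changes.append(abs(new_rank - old_rank))
    return sum(changes) / len(changes)

# Hypothetical example: one university drops out of the top 100 entirely.
prev = {"Alpha University": 1, "Beta University": 2, "Gamma University": 3}
curr = {"Alpha University": 2, "Beta University": 1, "Gamma University": 150}
print(noise_index(prev, curr))   # (1 + 1 + 98) / 3 = 33.33...
```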


Methodology

See here




Thursday, October 02, 2014

Which universities have the greatest research influence?

Times Higher Education (THE) claims that its Citations: Research Influence indicator, prepared by Thomson Reuters (TR), is the flagship of its World University Rankings. It is strange, then, that the magazine has never published a research influence ranking, although that ought to be just as interesting as its Young Universities Ranking, Reputation Rankings or gender index.

So let's have a look at the top 25 universities in the world this year ranked for research influence, as measured by Thomson Reuters' field- and year-normalised citations.

Santa Cruz and Tokyo Metropolitan have the same impact as MIT. Federico Santa Maria Technical University is ahead of Princeton. Florida Institute of Technology beats Harvard. Bogazici University and Scuola Normale Superiore do better than Oxford and Cambridge.

Are they serious?

Apparently. There will be an explanation in the next post. Meanwhile go and check if you don't believe me. And let me know if there's any dancing in the streets of Valparaiso, Pisa, Golden or Istanbul.


Rank and Score for Citations: Research Influence 2014-15 THE World Rankings

Rank University Score
1= University of California Santa Cruz 100
1= MIT 100
1= Tokyo Metropolitan University 100
4 Rice University 99.9
5= Caltech 99.7
5= Federico Santa Maria Technical University, Chile  99.7
7 Princeton University 99.6
8= Florida Institute of Technology 99.2
8= University of California Santa Barbara 99.2
10= Stanford University 99.1
10= University of California Berkeley 99.1
12= Harvard University 98.9
12= Royal Holloway University of London 98.9
14 University of Colorado Boulder  97.4
15 University of Chicago 97.3
16= Washington University of St Louis 97.1
16= Colorado School of Mines 97.1
18 Northwestern University 96.9
19 Bogazici University, Turkey  96.8
20 Duke University  96.6
21= Scuola Normale Superiore Pisa, Italy 96.4
21= University of California San Diego 96.4
23 Boston College 95.9
24 Oxford University 95.5
25= Brandeis University  95.3
25= UCLA 95.3

Thursday, September 25, 2014

How the Universities of Huddersfield, East London, Plymouth, Salford, Central Lancashire et cetera helped Cambridge overtake Harvard in the QS rankings

It is a cause of pride for the great and the good of British higher education that the country's universities  do brilliantly in certain global rankings. Sometimes though, there is puzzlement about how UK universities can do so well even though the  performance of the national economy  and the level of adult cognitive skills are so mediocre.

In the latest QS World University Rankings Cambridge and Imperial College London pulled off a spectacular feat when they moved ahead of Harvard into joint second place behind MIT, an achievement at first glance as remarkable as Leicester City beating Manchester United. Is this a tribute to the outstanding quality of teaching, inspired leadership or cutting edge research, or perhaps something else?

Neither Cambridge nor Imperial does very well in the research based rankings. Cambridge is 18th and Imperial 26th among higher education institutions in the latest Scimago rankings for output and 32nd and 33rd for normalised impact (citations per paper adjusted for field). Harvard is 1st and 4th for these indicators. In the CWTS Leiden Ranking, Cambridge is 22nd and Imperial 32nd for the mean normalised citation score, sometimes regarded as the flagship of these rankings, while Harvard is 6th.

It is true that Cambridge does much better on the Shanghai Academic Ranking of World Universities with fifth place overall, but that is in large measure due to an excellent score, 96.6, for alumni winning Nobel and Fields awards, some dating back several decades. For Highly Cited Researchers and publications in Nature and Science its performance is not nearly so good.

Looking at the THE World University Rankings, which make some attempt to measure factors other than research, Cambridge and Imperial come in 7th and 10th overall, which is much better than they do in the Leiden and Scimago rankings. However, it is very likely that the postgraduate teaching and research surveys made a significant contribution to this performance. Cambridge is 4th in the THE reputation rankings based on last year's data and Imperial is 13th.

Reputation is also a key to the success of Cambridge and Imperial in the QS world rankings. Take a look at the scores and positions of Harvard, Cambridge and Imperial in the rankings just released.

Harvard  gets 100 points (2nd place) for the academic survey, employer survey (3rd), and citations per faculty (3rd). It has 99.7 for faculty student ratio (29th), 98.1 for international faculty (53rd), and 83.8 for international students (117th). Harvard's big weakness is its relatively small percentage of international students.

Cambridge is in first place for the academic survey and 2nd in the employer survey, in both cases with a score of 100 and one place ahead of Harvard. The first secret of Cambridge's success is that it does much better on reputational measures than for bibliometric or other objective data. It was 18th for faculty student ratio, 73rd for international faculty, 50th for international students and 40th for citations per faculty.

So, Cambridge is ahead for faculty student ratio and international students and Harvard is ahead for international faculty and citations per faculty. Both get 100 for the two surveys.

Similarly, Imperial has 99.9 points for the academic survey (14th), 100 for the employer survey (7th), 99.8 for faculty student ratio (26th), 100 for international faculty (41st), 99.7 (20th) for international students and 96.2 (49th) for citations per faculty. It is behind Harvard for citations per faculty but just enough ahead for international students to squeeze past into joint second place.

The second secret is that QS's standardisation procedure combined with an expanding database means that the scores of the leading universities in the rankings are getting more and more squashed together at the top. QS turns its raw data into Z scores so that universities are measured according to their distance in standard deviations from the mean for all ranked universities. If the number of sub-elite universities in the rankings increases then the overall means for the indicators will fall and the scores of universities at the top end will rise as their distance in standard deviations from the mean increases.

Universities with scores of 98 and 99 will now start getting scores of 100. Universities with recorded scores of 100 will go on getting 100, although they might go up a few invisible decimal points.
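
To see how this squashing works, here is a toy sketch (hypothetical numbers, and a simple top-anchored rescaling rather than QS's actual procedure): as more sub-elite universities are added to the pool, the mean falls and the displayed scores of the universities just below the leader drift up towards 100.

```python
import statistics

def displayed_scores(raw):
    """Z scores rescaled so the top university shows 100 (a simplified stand-in
    for QS's standardisation, used only to illustrate the squashing effect)."""
    mean, sd = statistics.mean(raw), statistics.pstdev(raw)
    z = [(x - mean) / sd for x in raw]
    top = max(z)
    return [round(100 * v / top, 1) for v in z[:4]]   # show the four leaders

leaders = [90, 85, 80, 75]                      # hypothetical raw indicator values
pool_2008 = leaders + [30] * 600                # smaller pool of sub-elite universities
pool_2014 = leaders + [30] * 600 + [10] * 230   # pool expanded with weaker entrants

print(displayed_scores(pool_2008))   # gaps below the leader are wider
print(displayed_scores(pool_2014))   # the same leaders now sit closer to 100
```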

In 2008, QS ranked 617 universities. In that year, nine universities had a score of 100 for the academic survey, four for the employer survey, nine for faculty student ratio, six for international faculty, six for international students and seven for citations per faculty.

By 2014 QS was ranking over 830 universities (I assume that those at the end of the rankings marked "NA" are there because they got votes in the surveys but are not ranked because they fail to meet the criteria for inclusion). For each indicator the number of universities getting a score of 100 increased. In 2014 there were 13 universities with a score of 100 for the academic survey, 14 for the employer survey, 16 for faculty student ratio, 41 for international faculty, 15 for international students and 10 for citations per faculty.

In 2008 Harvard got the same score as Cambridge for the academic and employer surveys. It was 0.3 (0.06 weighted) behind for faculty student ratio, 0.6 (0.53 weighted) behind for international faculty, and 14.1 (0.705 weighted) behind for international students. It was, however, 11.5 points (2.3 weighted) ahead for citations per faculty. Harvard was therefore first and Cambridge third.

By 2014 Cambridge had fallen slightly behind Harvard for international faculty. It was slightly ahead for faculty student ratio. Scores for the two surveys remained the same, 100 for both places. Harvard reduced the gap for international students slightly.

What made the difference in 2014 and put Cambridge ahead of Harvard was citations per faculty. In 2008 Harvard, in fifth place for this indicator with a score of 100, was 11.5 points (2.3 weighted) ahead of Cambridge. In 2014 Cambridge had improved a bit -- it was 40th instead of 49th -- but now got 97.9 points, reducing the difference with Harvard to 2.1 points (0.42 weighted). That was just enough to let Cambridge overtake Harvard.

Cambridge's rise between 2008 and 2014 was thus largely due to the increasing number of ranked universities, which lowered the mean for each indicator, raised the Z scores at the top of each indicator, and so reduced the effect of Cambridge's comparatively lower citations per faculty score.

The same thing happened to Imperial. It did a bit better for citations, rising from 58th to 49th place, and its score rose from 83.10 to 96.20, again just enough to let it creep past Harvard.

Cambridge and Harvard should be grateful to those universities filling up the 701+ category at the bottom of the QS rankings. They are the invisible trampoline that propelled "Impbridge" into second place, just behind MIT.

QS should think carefully about adding more universities to their rankings. Another couple of hundred and there will be a dozen universities at the top getting 100 for everything.









Thursday, September 18, 2014

QS World University Rankings 2014



Publisher

QS (Quacquarelli Symonds)



Scope

Global. 701+ universities.


Top Ten


Place   University
1   MIT
2=   Cambridge
2=   Imperial College London
4   Harvard
5   Oxford
6   University College London
7   Stanford
8   California Institute of Technology (Caltech)
9   Princeton
10   Yale



Countries with Universities in the Top Hundred


Country   Number of Universities
USA   28
UK   19
Australia   8
Netherlands   7
Canada   5
Switzerland   4
Japan   4
Germany   3
China   3
Korea   3
Hong Kong   3
Denmark   2
Singapore   2
France   2
Sweden   2
Ireland   1
Taiwan   1
Finland   1
Belgium   1
New Zealand   1



Top Ranked in Region


North America:   MIT
Africa:   University of Cape Town
Europe:   Cambridge / Imperial College London
Latin America:   Universidade de Sao Paulo
Asia:   National University of Singapore
Central and Eastern Europe:   Lomonosov Moscow State University
Arab World:   King Fahd University of Petroleum and Minerals
Middle East:   Hebrew University of Jerusalem



Noise Index

In the top 20, this year's QS world rankings are less volatile than the previous edition but more so than the THE rankings or Shanghai ARWU. The top 20 universities in 2013 rose or fell an average of 1.45 places. The most remarkable change was the rise of Imperial College and Cambridge to second place behind MIT and ahead of Harvard.


Ranking   Average Place Change of Universities in the Top 20
QS World Rankings 2013-2014   1.45
QS World Rankings 2012-2013   1.70
ARWU 2013-2014   0.65
Webometrics 2013-2014   4.25
Center for World University Ranking (Jeddah) 2013-2014   0.90
THE World Rankings 2012-2013   1.20


Looking at the top 100 universities, the  QS rankings  are little different from last year. The average university in the top 100 moved up or down 3.94 places compared to 3.97 between 2012 and 2013. These rankings are more reliable than this year's ARWU, which was affected by the new lists of highly cited researchers, and last year's THE rankings.

Ranking   Average Place Change of Universities in the Top 100
QS World Rankings 2013-2014   3.94
QS World Rankings 2012-2013   3.97
ARWU 2013-2014   4.92
Webometrics 2013-2014   12.08
Center for World University Ranking (Jeddah) 2013-2014   10.59
THE World Rankings 2012-2013   5.36




Methodology (from topuniversities)

1. Academic reputation (40%)

Academic reputation is measured using a global survey, in which academics are asked to identify the institutions where they believe the best work is currently taking place within their field of expertise.
For the 2014/15 edition, the rankings draw on almost 63,700 responses from academics worldwide, collated over three years. Only participants’ most recent responses are used, and they cannot vote for their own institution. Regional weightings are applied to counter any discrepancies in response rates.
The advantage of this indicator is that it gives a more equal weighting to different discipline areas than research citation counts. Whereas citation rates are far higher in subjects like biomedical sciences than they are in English literature, for example, the academic reputation survey weights responses from academics in different fields equally.
It also gives students a sense of the consensus of opinion among those who are by definition experts. Academics may not be well positioned to comment on teaching standards at other institutions, but it is well within their remit to have a view on where the most significant research is currently taking place within their field.

2. Employer reputation (10%)

The employer reputation indicator is also based on a global survey, taking in almost 28,800 responses for the 2014/15 edition. The survey asks employers to identify the universities they perceive as producing the best graduates. This indicator is unique among international university rankings.
The purpose of the employer survey is to give students a better sense of how universities are viewed in the job market. A higher weighting is given to votes for universities that come from outside of their own country, so it’s especially useful in helping prospective students to identify universities with a reputation that extends beyond their national borders. 

3. Student-to-faculty ratio (20%)

This is a simple measure of the number of academic staff employed relative to the number of students enrolled. In the absence of an international standard by which to measure teaching quality, it provides an insight into the universities that are best equipped to provide small class sizes and a good level of individual supervision.

4. Citations per faculty (20%)

This indicator aims to assess universities’ research output. A ‘citation’ means a piece of research being cited (referred to) within another piece of research. Generally, the more often a piece of research is cited by others, the more influential it is. So the more highly cited research papers a university publishes, the stronger its research output is considered.
QS collects this information using Scopus, the world’s largest database of research abstracts and citations. The latest five complete years of data are used, and the total citation count is assessed in relation to the number of academic faculty members at the university, so that larger institutions don’t have an unfair advantage.

5 &amp; 6. International faculty ratio (5%) and international student ratio (5%)

The last two indicators aim to assess how successful a university has been in attracting students and faculty members from other nations. This is based on the proportion of international students and faculty members in relation to overall numbers. Each of these contributes 5% to the overall ranking results.
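
Putting the six weights together, here is a minimal sketch of how an overall QS score could be assembled from the indicator scores (the weights are those quoted above; the example figures are hypothetical):

```python
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def overall_score(indicator_scores):
    """Weighted sum of the already-scaled (0-100) indicator scores."""
    return sum(QS_WEIGHTS[name] * indicator_scores[name] for name in QS_WEIGHTS)

example = {                          # hypothetical university strong on reputation,
    "academic_reputation": 100.0,    # somewhat weaker on citations per faculty
    "employer_reputation": 100.0,
    "faculty_student_ratio": 99.7,
    "citations_per_faculty": 97.9,
    "international_faculty": 98.0,
    "international_students": 99.0,
}
print(round(overall_score(example), 2))   # 99.37
```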

Wednesday, September 10, 2014

America's Best Colleges

The US News & World Report's America's Best Colleges has just been published. There are no surprises at the top. Here are the top ten.

1.  Princeton
2.  Harvard
3.  Yale
4= Columbia
4= Stanford
4= Chicago
7.   MIT
8= Duke
8= University of Pennsylvania
10. Caltech

Analysis at the Washington Post indicates little movement at the top. Outside the elite there are some significant changes.

Liberal arts colleges
St. John's College, Annapolis from 123rd to 56th .
Bennington College from 122nd to 89th.

National universities
Northeastern University from 69th to 42nd.
Texas Christian University from 99th to 46th.




Sunday, August 17, 2014

The Shanghai Rankings (Academic Ranking of World Universities) 2014 Part 1

Publisher

Center for World-Class Universities, Shanghai Jiao Tong  University


Scope

Global. 500 institutions.


Methodology

See ARWU site.

In contrast to the other indicators, the Highly Cited Researchers indicator has undergone substantial changes in recent years, partly as a result of changes by data provider Thomson Reuters. Originally, ARWU used the old list of highly cited researchers prepared by Thomson Reuters (TR), which was first published in 2001 and updated in 2004. Since then no names have been added, although changes of affiliation submitted by researchers were recorded.

Until 2011, when a researcher listed more than one institution as his or her affiliation, credit for the highly cited indicator was divided equally. Following the recruitment of a large number of part time researchers by King Abdulaziz University, ARWU introduced a new policy of asking researchers how their time was divided. When there was no response, secondary affiliations were counted as 16%, which was the average time given by those who responded to the survey.
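
A minimal sketch of the affiliation-credit rules just described (not ARWU's own code; it assumes, where the text is silent, that the primary institution receives whatever is left after secondary affiliations take their 16% shares):

```python
def hici_credit(affiliations, reported_split=None):
    """Fractional highly-cited credit for one researcher.

    affiliations   -- list of institutions, primary first
    reported_split -- optional {institution: fraction} from the ARWU survey
    """
    if reported_split:                         # researcher reported how their time is divided
        return dict(reported_split)
    if len(affiliations) == 1:
        return {affiliations[0]: 1.0}
    secondary_share = 0.16                     # default used when there is no response
    credit = {inst: secondary_share for inst in affiliations[1:]}
    credit[affiliations[0]] = 1.0 - secondary_share * (len(affiliations) - 1)
    return credit

print(hici_credit(["Primary University", "Secondary University"]))
# secondary affiliation gets 0.16, primary keeps the remaining 0.84
```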

In 2013 TR announced that they were introducing a new list based on field-normalised citations over the period 2002-2012. However, problems with the preparation of the new list meant that it could not be used in the 2013 rankings. Instead, the Shanghai rankings repeated the 2012 scores.

During 2013, KAU recruited over 100 highly cited researchers who nominated the university as a secondary affiliation. That caused some comment by researchers and analysts. A paper by Lutz Bornmann and Johann Bauer concluded that to " counteract attempts at manipulation, ARWU should only consider primary institutions of highly cited researchers."

It seems that Shanghai has acted on this advice: "It is worth noting that, upon the suggestion of many institutions and researchers including some Highly Cited Researchers, only the primary affiliations of new Highly Cited Researchers are considered in the calculation of an institution’s HiCi score for the new list."

As a result, KAU has risen into the lower reaches of the 150-200 band on the basis of publications, some papers in Nature and Science and a modest number of primary affiliations among highly cited researchers. That is a respectable achievement but one that would have been much greater if the secondary affiliations had been included.


Perhaps Shanghai should also take note of the suggestion in a paper by Lawrence Cram and Domingo Docampo that "[s]ignificant acrimony accompanies some published comparisons between ARWU and other rankings (Redden, 2013) driven in part by commercial positioning. Given its status as an academic ranking, it may be prudent for ARWU to consider replacing its HiCi indicator with a measure that is not sourced from a commercial provider if such a product can be found that satisfies the criteria (objective, open, independent) used by ARWU."


Top Ten


Place University
1 Harvard
2 Stanford
3 MIT
4 University of California Berkeley
5 Cambridge
6 Princeton
7 California Institute of Technology (Caltech)
8 Columbia
9= Chicago
9= Oxford



Countries With Universities in the Top 100



Country Number of Universities
United States    52
United Kingdom                                           8
Switzerland 5
Germany 4
France 4
Netherlands 4
Australia 4
Canada 4
Japan 3
Sweden 3
Belgium 2
Israel 2
Denmark 2
Norway 1
Finland 1
Russia 1



Tuesday, August 05, 2014

Webometrics: Ranking Web of Universities 2nd 2014 Edition


The Webometrics rankings are based on web-derived data. They cover more than 22,000 institutions, far more than conventional rankings, and should always be consulted as a check on the plausibility of the others. They are, however, extremely volatile and that reduces their reliability considerably.
Publisher

Cybermetrics Lab, CSIC, Madrid



Scope

Global. 22,000+ institutions.


Methodology

From the Webometrics site.


The current composite indicator is now built as follows:
Visibility (50%)
IMPACT. The quality of the contents is evaluated through a "virtual referendum", counting all the external inlinks that the University webdomain receives from third parties. Those links recognize the institutional prestige, the academic performance, the value of the information, and the usefulness of the services as introduced in the webpages according to the criteria of millions of web editors from all over the world. The link visibility data is collected from the two most important providers of this information: Majestic SEO and ahrefs. Both use their own crawlers, generating different databases that should be used jointly for filling gaps or correcting mistakes. The indicator is the product of the square root of the number of backlinks and the number of domains originating those backlinks, so not only is link popularity important but even more so link diversity. The maximum of the normalized results is the impact indicator.
Activity (50%)
PRESENCE (1/3). The total number of webpages hosted in the main webdomain (including all the subdomains and directories) of the university as indexed by the largest commercial search engine (Google). It counts every webpage, including all the formats recognized individually by Google, both static and dynamic pages and other rich files. It is not possible to have a strong presence without the contribution of everybody in the organization as the top contenders are already able to publish millions of webpages. Having additional domains or alternative central ones for foreign languages or marketing purposes penalizes in this indicator and it is also very confusing for external users.
OPENNESS (1/3). The global effort to set up institutional research repositories is explicitly recognized in this indicator that takes into account the number of rich files (pdf, doc, docx, ppt) published in dedicated websites according to the academic search engine Google Scholar. Both the total records and those with correctly formed file names are considered (for example, the Adobe Acrobat files should end with the suffix .pdf). The objective is to consider recent publications that now are those published between 2008 and 2012 (new period).
EXCELLENCE (1/3). The academic papers published in high impact international journals are playing a very important role in the ranking of Universities. Using simply the total number of papers can be misleading, so we are restricting the indicator to only those excellent publications, i.e. the university scientific output being part of the 10% most cited papers in their respective scientific fields. Although this is a measure of high quality output of research institutions, the data provider Scimago group supplied non-zero values for more than 5200 universities (period 2003-2010). In future editions it is intended to match the counting periods between Scholar and Scimago sources.
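
Reading the visibility description above literally, here is a rough sketch of the IMPACT calculation (the normalisation step is a simple placeholder; the exact scaling Webometrics applies is not spelled out here, and all figures are hypothetical):

```python
import math

def impact_raw(backlinks, referring_domains):
    """Raw visibility value: square root of the backlink count multiplied by the
    number of distinct domains originating those backlinks, as described above."""
    return math.sqrt(backlinks) * referring_domains

def impact_indicator(majestic, ahrefs, best_majestic, best_ahrefs):
    """Normalise each provider's value against the best-scoring institution
    (placeholder normalisation) and keep the higher of the two results."""
    return max(impact_raw(*majestic) / best_majestic,
               impact_raw(*ahrefs) / best_ahrefs)

# Hypothetical figures for one university and for the top scorer in each database.
score = impact_indicator(majestic=(250_000, 4_000), ahrefs=(180_000, 3_500),
                         best_majestic=impact_raw(4_000_000, 60_000),
                         best_ahrefs=impact_raw(3_000_000, 55_000))
print(round(score, 4))
```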

Top Ten

1.    Harvard University
2.    MIT
3.    Stanford University
4.    Cornell University
5.    University of Michigan
6.    University of California Berkeley
7=   Columbia University
8=   University of Washington
9.    University of Minnesota
10.  University of Pennsylvania

Countries with Universities in the Top Hundred

USA                      66
Canada                  7
UK                          4  
Germany                3
China                      3
Japan                     2
Switzerland            2
Netherlands           1
Australia                1
Italy                         1
South Korea          1
Taiwan                   1 
Belgium                 1
Hong Kong            1
Brazil                      1 
Austria                   1
Czech Republic    1
Singapore             1        
Mexico                   1



Top Ranked in Region

USA:                             Harvard
Canada:                       Toronto
Latin America:             Sao Paulo
Caribbean:                   University of the West Indies
Europe:                        Oxford
Africa:                           University of Cape Town
Asia:                             Seoul National University
South Asia:                   IIT Bombay
Southeast Asia:            National University of Singapore
Middle East:                 Hebrew University of Jerusalem
Arab World:                  King Saud University
Oceania:                       Melbourne

Noise Index
Average position change of universities in the top 20 in 2013:

4.25

Comparison

Center for World University Rankings         --  0.90
Shanghai Rankings (ARWU): 2011-12      --  0.15
Shanghai Rankings (ARWU) 2012-13       --  0.25
THE WUR:  2012-13                                    --  1.20
QS  WUR    2012-13                                    --  1.70  


Average position change of universities in the top 100 in 2013

12.08

Comparison

Center for World University Rankings               --  10.59 
 Shanghai Rankings (ARWU): 2011-12               --  2.01
 Shanghai Rankings   2012-13                            --  1.66
THE WUR:  2012-13                                            --   5.36
QS  WUR    2012-13                                            --   3.97


Thursday, July 17, 2014

The CWUR Rankings

The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced a global ranking of 1,000 universities. Last year and in 2012, 100 universities were ranked. The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.

The rankings include five indicators that measure various aspects of publication and research: Publications in reputable journals, Influence (research papers in highly influential journals), Citations, Broad Impact (h-index) and Patents (h-index).

Altogether these have a weighting of 25%, which seems on the low side for modern world class research universities. The use of the h-index, which reduces the impact of outliers and anomalous cases, is a useful addition to the standard array of indicators. So too is the use of patents filed as a measure of innovation.

Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). There would appear to be an obvious bias here towards older institutions. There is also a problem that such awards are likely to be concentrated among relatively few universities so that this indicator would  not discriminate among  those outside the world elite.

A quarter is assigned to Quality of Faculty measured by the number of faculty receiving such awards and another quarter to Alumni Employment measured by the number of CEOs of top corporations.

The last three indicators are unlikely to be regarded as satisfactory. The number of CEOs is largely irrelevant to the vast majority of institutions.

In general, these are a useful addition to the current array of global rankings but the non-research indicators are narrow and not very meaningful. There is also a very serious problem with reliability as noted below. 

Now for the standard presentation of rankings, with the addition of a noise analysis. 


Publisher

Center for World University Rankings, Jeddah, Saudi Arabia


Scope

Global. 1,000 universities.

Methodology

Quality of Education (25%) measured by alumni winning international awards relative to size.
Alumni Employment  (25%) measured by CEOs of top companies relative to size.
Quality of Faculty (25%) measured by "academics" winning international awards relative to size.
Publications in reputable journals (5%).
Influence measured by publications in highly influential journals (5%).
Citations measured by the number of highly cited papers (5%).
Broad Impact measured by h-index (5%).
Patents measured by the number of international filings (5%)

Top Ten

1.   Harvard
2.   Stanford
3.   MIT
4.   Cambridge
5.   Oxford
6.   Columbia
7.   Berkeley
8.   Chicago
9.   Princeton
10. Yale

Countries with Universities in the Top Hundred

USA               54
Japan              8
UK                   7
Switzerland    4
France            4
Germany        4
Israel              3
Canada         3
China             2
Sweden         2
South Korea  1
Russia            1
Taiwan           1
Singapore     1
Denmark        1
Netherlands     1
Italy                 1
Belgium         1
Australia        1


Top Ranked in Region

USA:                             Harvard
Canada:                       Toronto
Latin America:             Sao Paulo
Europe:                        Cambridge
Africa:                           Witwatersrand
Asia:                             Tokyo
Middle East:                Hebrew University of Jerusalem
Arab World:                 King Saud University

Noise Index
Average position change of universities in the top 20 in 2013:

0.9

Comparison

Shanghai Rankings (ARWU): 2011-12  --  0.15; 2012-13 --  0.25
THE WUR:  2012-13  --   1.2
QS  WUR    2012-13  --   1.7

Average position change of universities in the top 100 in 2013

10.59

Comparison

Shanghai Rankings (ARWU): 2011-12  --  2.01; 2012-13 --  1.66
THE WUR:  2012-13  --   5.36
QS  WUR    2012-13  --   3.97

The CWUR rankings, once we leave the top 20, are extremely volatile, even more than the THE and QS rankings. This, unfortunately, is enough to undermine their credibility. A pity since there are some promising ideas here.





Tuesday, July 15, 2014

Another New Highly Cited Researchers List

Thomson Reuters have published another document, The World's Most Influential Scientific Minds, which contains the most highly cited researchers for the period 2002-13. This one includes only the primary affiliation of the researchers, not the secondary ones. If the Shanghai ARWU rankings, due in August, use this list rather than the one published previously, they will save themselves a lot of embarrassment.

Over at arxiv, Lutz Bornmann and Johann Bauer have produced a ranking of the leading institutions according to the number of highly cited researchers' primary affiliation. Here are their top ten universities, with government agencies and independent research centres omitted.

1.  University of California (all campuses)
2.  Harvard
3.  Stanford
4.  University of Texas (all campuses)
5.  University of Oxford
6.  Duke University
7.  MIT
8.  University of Michigan (all campuses)
9.  Northwestern University 
10. Princeton

Compared to the old list, used for the Highly Cited indicator in the first Shanghai rankings in 2003, Oxford and Northwestern are doing better and MIT and Princeton somewhat worse.

Bornmann and Bauer have also ranked universities according to the number of primary and secondary affiliations (counting each recorded affiliation as a fraction). The top ten are:

1.  University of California (all campuses)
2.  Harvard
3.  King Abdulaziz University, Jeddah, Saudi Arabia
4.  Stanford
5.  University of Texas 
6.  MIT
7.  Oxford
8.  University of Michigan
9.  University of Washington
10.  Duke

The paper concludes:

"To counteract attempts at manipulation, ARWU should only consider primary 

institutions of highly cited researchers. "




Thursday, June 19, 2014

The New Highly Cited Researchers List

Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one which has 7,000 plus.

The new list got off to a bad start in 2013 because the preliminary list was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking at the number of Harvard researchers for a few minutes, I have noticed that David M Sabatini, primary affiliation MIT with secondary affiliations at Broad Institute Harvard and MIT, is listed  for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliations with Broad Institute Harvard and MIT and secondary affiliations with MIT and Harvard, is listed three times for  Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt represents the reality of scientific research in which a single researcher might well excel in two or more closely related fields but if ARWU are just going to count the number of researchers in the new list there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 in this case -- and by year of publication. In other words, it is not the number of citations that matters but the numbers in relation to the world average for field and year of citation.
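
A minimal sketch of what field- and year-normalisation means in practice (illustrative only; TR's actual baselines and aggregation are more elaborate): each paper's citation count is compared with the world average for its field and publication year, so a paper with 40 citations in a heavily cited field can count for less than one with 12 citations in a lightly cited field.

```python
def normalised_citation_score(papers, world_baselines):
    """papers: list of (field, year, citations);
    world_baselines: {(field, year): world-average citations for that field and year}."""
    ratios = [cites / world_baselines[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Hypothetical figures: ratios to the world average matter, not raw counts.
papers = [("Clinical Medicine", 2005, 40), ("Molecular Biology", 2010, 12)]
baselines = {("Clinical Medicine", 2005): 25.0, ("Molecular Biology", 2010): 6.0}
print(normalised_citation_score(papers, baselines))   # (1.6 + 2.0) / 2 = 1.8
```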

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of  secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliation to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah has 150 secondary affiliations. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence  that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations based indicator or only one database.





Thursday, June 05, 2014

The World's Top Global Thinkers

And finally the results are out. The world's leading thinker, according to a poll conducted by Prospect magazine, is the economist Amartya Sen, followed by Raghuram Rajan, Governor of the Reserve Bank of India, and the novelist Arundhati Roy.

Sen received degrees from the University of Calcutta and Cambridge and has taught at Jadavpur University, LSE, Oxford, Cambridge and Harvard. Rajan has degrees from IIT Delhi, the Indian Institute of Management Ahmedabad and MIT. Roy studied architecture at the School of Architecture and Planning in Delhi.

The careers of Sen and Rajan illustrate a typical feature of Indian higher education: excellent undergraduate teaching, but somehow the outstanding students end up leaving India.

Prospect notes that the poll received "intense media interest in India" so it would be premature to conclude that the country has become the new global Athens.

The top non-Asian thinker is Pope Francis.

Personally, I am disappointed that the historian Perry Anderson only got 28th place.  I am also surprised that feminist and queer theorist Judith Butler, whose brilliant satire -- Hamas as part of the global left and so on -- is under-appreciated, was only 21st.

Thursday, April 10, 2014

The Parochial World of Global Thinkers

The magazine Prospect has just published its list of fifty candidates for the title of global thinker. It is rather different from last year. Number one in 2013, Richard Dawkins, biologist and atheist spokesman, is out. Jonathon Derbyshire, Managing Editor of Prospect, in an interview with the Digital Editor of Prospect says that is because Dawkins  has been saying the same thing for several years. Presumably Prospect only noticed this year.

The list is top heavy with philosophers and economists and Americans and Europeans. There is one candidate from China, one from Africa, one from Brazil and none from Russia. There is one husband and wife. A large number are graduates of Harvard or have taught there and quite a few are from Yale, MIT, Berkeley, Cambridge and Oxford. One wonders if the selectors made some of their choices by going through the contents pages of New Left Review. So far I have counted six contributors.

There are also no Muslims. Was Prospect worried about a repetition of that unfortunate affair in 2008?

All in all, apart from Pope Francis, this does not look like a global list. Unless, that is, thinking has largely retreated to the humanities and social science faculties of California, New England and Oxbridge.








Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were  a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning but it is difficult to see any clear and consistent trends, although the media have done their best to find stories, UK universities falling or sliding or slipping, no Indian or Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 in a reputation  survey in which research has a two thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute  in 2012 is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos Mori CT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group, researchers with papers in the ISI indexes. THE claim that this means that their respondents must therefore be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have usually made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010 they announced that the forms would be emailed according to the number of researchers recorded by UNESCO in 2007. It is not clear if this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New  Zealand, appears to have a very large  number of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1 % of the world's researchers in that region.

The number of responses received appears reasonably large although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than in 2012, when the figure was 16,639. Again, it is not clear what happened.

The number of responses from the various subject areas has changed somewhat. Since 2012 the proportion from the social sciences has risen from 19% to 22%, as has that from engineering and technology, while the life sciences have gone from 16% to 22%.

QS do not publish reputation surveys but it is possible to filter their ranking scores to find out how universities performed on their academic survey.

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore based academic publishing company with links to Imperial College London. Then they added respondents from  Mardev, a publisher of academic lists, to beef up the number of names in the humanities. Since then the balance has shifted with more names coming from Mardev with some topping up from World Scientific. QS have also added a sign up facility where people are allowed to apply to receive survey forms. That was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and respondents to suggest further names. The  exact number of responses coming from all these different sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were not allowed to vote for the universities where they were currently employed. They were restricted to one response per computer and universities were not allowed to solicit votes or instruct staff who to vote for or who not to vote for. Then they were told not to promote any form of participation in the surveys.

In addition to methodological changes, the proportion of responses from different countries has changed significantly since 2007, with a large increase from Latin America, especially Brazil and Mexico, the USA and larger European countries, and a fall in those from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology or distribution of responses or a genuine shift in international reputation.

Comparing the THE-TR and QS surveys there is some overlap at the top. The top five are the same in both although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and  Latin America. Tokyo is seventh according to QS but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE -TR. Sao Paulo is 51st in the QS indicator but is in the 81-90 band in the THE-TR rankings. The Autonomous National University of Mexico (UNAM) is not even in THE-TR's top 100 but QS put it 48th.

On the other hand Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger less experienced pool of respondents from less regarded universities and from countries with high aspirations but so far limited achievements.






Sunday, March 30, 2014

The Nature Publication Index

Nature has long been regarded as the best or one of the two best scientific journals in the world. Papers published there and in Science  account for 20 % of the weighting for Shanghai Jiao Tong University's Academic Ranking of World Universities, the same as Nobel and Fields awards or publications in the whole of the Science Citation and Social Science Citation Indexes.

Sceptics may wonder whether Nature has seen better years and is perhaps sliding away from the pinnacle of scientific publishing. It has had some embarrassing moments in recent decades, including the publication of a 1978 paper that gave credence to the alleged abilities of the psychic Uri Geller, the report of a study by Jacques Benveniste and others that purported to show that water has a memory, the questionable "hockey stick" article on global warming in 1998, and seven retracted papers on superconductivity by Jan Hendrik Schon.

But it still seems that Nature is highly regarded by the global scientific community and that the recent publication of the Nature Publication Index is a reasonable guide to current trends in scientific research. This counts the number of publications in Nature in 2013.

The USA remains on top with Harvard first, MIT second and Stanford third, although China continues to make rapid progress. For many parts of the world -- Latin America, Southern Europe, Africa -- scientific achievement is extremely limited. Looking at the Asia-Pacific rankings, much of the region, including Indonesia, Bangladesh and the Philippines, is almost a scientific desert.