Monday, August 31, 2015

Update on changes in ranking methodology

Times Higher Education (THE) have been preparing the ground for methodological changes in their world rankings. A recent article by Phil Baty announced that the new world rankings, scheduled for September 30, will not count the citations to 649 papers, mainly in particle physics, with more than 1,000 authors.

This is perhaps the best that is technically and commercially feasible at the moment, but it is far from satisfactory. Some of these publications deal with the most basic questions about the nature of physical reality, and excluding them from the ranking methodology is a serious distortion. There have been complaints about this. Pavel Krokovny's comment was noted in a previous post, while Mete Yeyisoglu argues that:
"Fractional counting is the ultimate solution. I wish you could have worked it out to use fractional counting for the 2015-16 rankings.
The current interim approach you came up with is objectionable.
Why 1,000 authors? How was the limit set? What about 999 authored-articles?
Although the institution I work for will probably benefit from this interim approach, I think you should have kept the same old methodology until you come up with an ultimate solution.
This year's interim fluctuation will adversely affect the image of university rankings."

Baty provides a reasonable answer to the question why the cut-off point is 1,000 authors.

But there is a fundamental issue developing here that goes beyond ranking procedure. The concept of authorship for a philosophy paper written entirely by a single person, or a sociological study from a small research team, is very different from that of the huge multinational, capital- and labour-intensive publications in which the number of collaborating institutions exceeds the number of paragraphs and there are more authors than sentences.

Fractional counting does seem to be the only fair and sensible way forward and it is now apparently on THE's agenda although they have still not committed themselves.

The objection could be raised that while the current THE system gives a huge reward to even the least significant contributing institution, fractional counting would give major research universities insufficient credit for their role in important research projects.

A long-term solution might be to draw a distinction between the contributors to and the authors of the mega-papers. For most publications there would be no need to draw such a distinction, but for those with some sort of input from dozens, hundreds or thousands of people it might be feasible to allot half the credit to all those who had anything to do with the project and the other half to those who meet the standard criteria of authorship. There would no doubt be a lot of politicking about who gets the credit, but that would be nothing new.
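The half-and-half allocation suggested above could be sketched as follows. The function and the four-person example are purely illustrative, not anything THE or anyone else has actually proposed in detail:

```python
# Hypothetical sketch of a 50/50 credit split: half the credit is
# shared equally among everyone associated with the project, the
# other half among those meeting standard authorship criteria.
def split_credit(contributors, authors):
    """Per-person credit shares; total credit for one paper is 1.0."""
    shares = {name: 0.0 for name in set(contributors) | set(authors)}
    for name in contributors:
        shares[name] += 0.5 / len(contributors)
    for name in authors:  # authors are normally also contributors
        shares[name] += 0.5 / len(authors)
    return shares

# Four people worked on a project; only one wrote the paper.
shares = split_credit(["A", "B", "C", "D"], ["A"])
# A gets 0.5/4 + 0.5 = 0.625; B, C and D get 0.125 each.
```

However the politicking played out, the total credit per paper would stay fixed at one, unlike the current system where every listed institution gets full credit.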

Duncan Ross, the new Data and Analytics Director at THE, seems to be thinking along these lines.
"In the longer term there are one technical and one structural approach that would be viable.  The technical approach is to use a fractional counting approach (2932 authors? Well you each get 0.034% of the credit).  The structural approach is more of a long term solution: to persuade the academic community to adopt metadata that adequately explains the relationship of individuals to the paper that they are ‘authoring’.  Unfortunately I’m not holding my breath on that one."
The counting of citations to mega papers is not the only problem with the THE citations indicator. Another is the practice of giving a boost to universities in underperforming countries. Another item by Phil Baty quotes this justification from Thomson Reuters, THE's former data partner.

“The concept of the regional modification is to overcome the differences between publication and citation behaviour between different countries and regions. For example some regions will have English as their primary language and all the publications will be in English, this will give them an advantage over a region that publishes some of its papers in other languages (because non-English publications will have a limited audience of readers and therefore a limited ability to be cited). There are also factors to consider such as the size of the research network in that region, the ability of its researchers and academics to network at conferences and the local research, evaluation and funding policies that may influence publishing practice.”

THE now appear to agree that this is indefensible in the long run and hope that a more inclusive academic survey and the shift to Scopus, with broader coverage than the Web of Science, will lead to this adjustment being phased out.

It is a bit odd that TR and THE should have introduced income, in three separate indicators, and international outlook, in another three, as markers of excellence, but then included a regional modification to compensate for limited funding and international contacts.

THE are to be congratulated for having put fractional counting and phasing out the regional modification on their agenda. Let's hope it doesn't take too long.

While we are on the topic, there are some more things about the citation indicator to think about. First, to repeat a couple of points mentioned in the earlier post:

  • Reducing the number of fields, or doing away with normalisation by year of citation. The more boxes into which any given citation can be dropped, the greater the chance of statistical anomalies when a cluster of citations meets a low world average for that particular field, year of publication and year of citation (300 fields in Scopus?).

  • Reducing the weighting for this indicator. Perhaps citations per paper normalised by field is a useful instrument for comparing the quality of research at MIT, Caltech, Harvard and the like, but it might be of little value when comparing the research performance of Panjab University and IIT Bombay, or Istanbul University and Bogazici.

Some other things THE could think about.

  • Adding a measure of overall research impact, perhaps simply by counting citations. At the very least, stop calling field- and year-normalised, regionally modified citations per paper a measure of research impact. Call it research quality or something like that.

  • Doing something about secondary affiliations. So far this seems to have been a problem mainly for the Highly Cited Researchers indicator in the Shanghai ARWU, but it may not be very long before more universities realise that a few million dollars for adjunct faculty could have a disproportionate impact on publication and citation counts.

  • Also, perhaps THE should consider excluding self-citations (or even citations within the same institution although that would obviously be technically difficult). Self-citation caused a problem in 2010 when Dr El Naschie's diligent citation of himself and a few friends lifted Alexandria University to fourth place in the world for research impact. Something similar might happen again now that THE are using a larger and less selective database.


Friday, August 28, 2015

The Richest University in China ...


   ...   is Tsinghua University but Zhejiang, Peking and Shanghai Jiao Tong Universities appear to be more productive, as measured by the Publications indicator in the Shanghai rankings.

China Daily has just published a list of the top ten universities in China ranked according to annual income as reported to the Ministry of Education. Here they are with the Publications score (papers in the Science Citation Index and the Social Science Citation Index in 2014) in brackets.


1.     Tsinghua University 17.56 billion yuan (63.8)
2.     Zhejiang University 15.64 billion yuan  (68.5)
3.     Peking University 12.85 billion yuan      (64)
4.     Shanghai Jiao Tong University 11.89 billion yuan   (68.5)
5.     Fudan University 7.71 billion yuan (56.1)
6.     Wuhan University 6.83 billion yuan (45.8)
7.     Jilin University 6.82 billion yuan  (50.7)
8.     Huazhong University of Science and Technology 6.81 billion yuan  (53.1)
9.     Sun Yat-sen University 6.69 billion yuan (54.9)
10.   Sichuan University 6.58 billion yuan    (54.2).

Tuesday, August 25, 2015

Not fair to call papers freaky

A comment by Pavel Krokovny of Heidelberg University about THE's proposal to exclude papers with 1,000+ authors from their citations indicator in the World University Rankings.

"It is true that all 3k+ authors do not draft the paper together, on the contrary, only a small part of them are involved in this very final step of a giant research work leading to a sound result. It is as well true that making the research performed public and disseminating the knowledge obtained is a crucial step of the whole project. 
But what you probably missed is that this key stage would not be possible at all without a unique setup which was built and operated by profoundly more physicists and engineers than those who processed raw data and wrote a paper. Without that "hidden part of the iceberg" there would be no results at all. And it would be completely wrong to assume that the authors who did the data analysis and wrote the paper should be given the highest credit in the paper. It is very specific for the experimental HEP field that has gone far beyond the situation that was common still in the first half of 20th century when one scientist or a small group of them might produce some interesting results. The "insignificant" right tail in your distribution of papers on number of coauthors contains the hot part of the modern physics with high impact results topped by the discovery of Higgs-boson. And in your next rankings you are going to dishonour those universities that contributed to this discovery."

and


"the point is that frequent fluctuations of the ranking methodology might damage the credibility of the THE. Certainly, I do not imply here large and well-esteemed universities like Harvard or MIT. I believe their high rankings positions not to be affected by nearly any reasonable changes in the methodology. However, the highest attention to the rankings is attracted from numerous ordinary institutions across the world and their potential applicants and employees. In my opinion, these are the most concerned customers of the THE product. As I already pointed out above, it's very questionable whether participation in large HEP experiments (or genome studies) should be considered "unfair" for those institutions."

Sunday, August 23, 2015

Changes in Ranking Methodology

This year and next the international university rankings appear to be set for more volatility with unusually large upward and downward movement, partly as a result of changes to the methodology for counting citations in the QS and THE rankings.

ARWU

The global ranking season kicked off last week with the publication of the latest edition of the Academic Ranking of World Universities from the ShanghaiRanking Consultancy (SRC), which I hope to discuss in detail in a little while. These rankings are rather dull and boring, which is exactly what they should be. Harvard is, as always, number one for all but one of the indicators. Oxford has slipped from joint ninth to tenth place. Warwick has leaped into the top 100 by virtue of a Fields medal. At the foot of the table there are new contenders from France, Korea and Iran.

Since they began in 2003 the Shanghai rankings have been characterised by a generally stable methodology. In 2012, however, they had to deal with the recruitment of a large and unprecedented number of adjunct faculty by King Abdulaziz University. Previously SRC had simply divided the credit for the Highly Cited Researchers indicator equally between all institutions listed as affiliations. In 2012 and 2013 they wrote to all highly cited researchers with joint affiliations and thus determined the division of credit between primary and secondary affiliations. Then, in 2014 and this year, they combined the old Thomson Reuters list, first issued in 2001, and the new one, issued in 2014, and excluded all secondary affiliations in the new list.

The result was that in 2014 the rankings showed an unusual degree of volatility, although this year things are a lot more stable. My understanding is that Shanghai will move to counting only the new list next year, again without secondary affiliations, so there should be a lot of interesting changes then. It looks as though Stanford, Princeton, University of Wisconsin-Madison and Kyoto University will suffer because of the change, while University of California Santa Cruz, Rice University, University of Exeter and University of Wollongong will benefit.

While SRC has efficiently dealt with the issue of secondary affiliations in its Highly Cited indicator, the issue has now resurfaced in the unusually high scores achieved by King Abdulaziz University for publications, largely because of its adjunct faculty. Expect more discussion over the next year or so. It would seem sensible for SRC to consider a five- or ten-year period rather than one year for their Publications indicator, and academic publishers, the media and rankers in general may need to give some thought to the proliferation of secondary affiliations.


QS

On July 27 Quacquarelli Symonds (QS) announced that for 18 months they had been thinking about normalising the counting of citations across five broad subject areas. They observed that a typical institution would receive about half of its citations from the life sciences and medicine, over a quarter from the natural sciences but just 1% from the arts and humanities.

In their forthcoming rankings QS will assign a 20% weighting for citations to each of the five subject areas, something that, according to Ben Sowter, Research Director at QS, they have been doing for the academic opinion survey.
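In outline, the change means giving each broad area an equal fifth of the citations score rather than summing raw citations. A sketch with assumed illustrative figures (QS's actual formula is more involved than this):

```python
# Sketch of citations normalised across five broad subject areas,
# each weighted at 20%. The area names follow QS's faculty areas;
# the scoring formula itself is an illustrative simplification.
AREAS = ["arts & humanities", "engineering & technology",
         "life sciences & medicine", "natural sciences",
         "social sciences"]

def normalised_score(uni_cites, world_avg_cites):
    """Compare the university to the world average within each area,
    with every area counting for exactly 20% of the total."""
    return sum(0.2 * uni_cites[a] / world_avg_cites[a] for a in AREAS)

# A university exactly at the world average in every area scores 1.0,
# however citation-rich or citation-poor each field is in raw terms.
uni = {"arts & humanities": 1, "engineering & technology": 4,
       "life sciences & medicine": 50, "natural sciences": 25,
       "social sciences": 5}
score = normalised_score(uni, uni)  # = 1.0 by construction
```

The point of the design is visible in the example: the 50 citations per paper in medicine no longer swamp the single citation in the arts and humanities, because each area is capped at a fifth of the score.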

It would seem then that there are likely to be some big rises and big falls this September. I would guess that places strong in humanities, social sciences and engineering like LSE, New York University and Nanyang Technological University may go up and some of the large US state universities and Russian institutions may go down. That's a guess because it is difficult to tell what happens with the academic and employer surveys.

QS have also made an attempt to deal with the issue of hugely cited papers with hundreds, even thousands of "authors" -- contributors would be a better term -- mainly in physics, medicine and genetics. Their approach is to exclude all papers with more than 10 contributing institutions, that is 0.34% of all publications in the database.

This is rather disappointing. Papers with huge numbers of authors and citations obviously do have distorting effects but they have often dealt with fundamental and important issues. To exclude them altogether is to ignore a very significant body of research.

The obvious solution to the problem of multi-contributor papers is fractional counting, dividing the number of citations by the number of contributors or contributing institutions. QS claim that to do so would discourage collaboration, which does not sound very plausible.

In addition, QS will likely extend the life of survey responses from three to five years. That could make the rankings more stable by smoothing out annual fluctuations in survey responses and reduce the volatility caused by the proposed changes in the counting of citations.

The shift to a moderate version of field normalisation is helpful as it will reduce the undue privilege given to medical research, without falling into the huge problems that result from using too many categories. It is unfortunate, however, that QS have not taken the plunge into fractional counting. One suspects that technical problems and financial considerations might be as significant as the altruistic desire not to discourage collaboration.

After a resorting in September the QS rankings are likely to become a bit more stable and credible, but their most serious problem, the structure, validity and excessive weighting of the academic survey, has still not been addressed.

THE

Meanwhile, Times Higher Education (THE) has also been grappling with the issue of authorship inflation. Phil Baty has announced that this year 649 papers with over 1,000 authors will be excluded from their calculation of citations because "we consider them to be so freakish that they have the potential to distort the global scientific landscape".

But it is not the papers that do the distorting. It is the methodology. THE and their former data partner Thomson Reuters, like QS, have avoided fractional counting (except for a small experimental African ranking), and so every one of those hundreds or thousands of authors gets full credit for the hundreds or thousands of citations. This has given places like Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Université Cadi Ayyad in Morocco and Bogazici University in Turkey remarkably high scores for Citations: Research Impact, much higher than their scores for the bundled research indicators.

THE have decided to simply exclude 649 papers, or 0.006% of the total, from their calculations for the world rankings. This is a much smaller exclusion than QS's. Again, it is a rather crude measure. Many of the "freaks" are major contributions to advanced research and deserve to be acknowledged by the rankings in some way.

THE did use fractional counting in their recent experimental ranking of African universities and Baty indicates that they are considering doing so in the future.

It would be a big step forward for THE if they introduce fractional counting of citations. But they should not stop there. There are other bugs in the citations indicator that ought to be fixed.

First, it does not at present measure what it is supposed to measure. It does not measure a university's overall research impact. At best, it is a measure of the average quality of research papers no matter how few (above a certain threshold) they are.

Second, the "regional modification", which divides a university's citation impact score by the square root of the score of the country where the university is located, is another source of distortion. It gives a bonus to universities simply for being located in underperforming countries. THE or TR have justified the modification by suggesting that some universities deserve compensation because they lack funding or networking opportunities. Perhaps they do, but this can still lead to serious anomalies.
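Concretely, the modification works roughly like this. The impact figures below are illustrative, and the exact normalisation TR used was more elaborate:

```python
import math

# The "regional modification": divide a university's field-normalised
# citation impact by the square root of its country's average score,
# which boosts universities located in low-scoring countries.
def regional_modification(university_impact, country_impact):
    return university_impact / math.sqrt(country_impact)

# Two universities with identical raw impact scores of 1.0:
in_strong_country = regional_modification(1.0, 1.0)   # stays at 1.0
in_weak_country = regional_modification(1.0, 0.25)    # doubled to 2.0
```

So a university whose country averages a quarter of the benchmark ends up with twice the modified score of an otherwise identical university in a top-scoring country, which is exactly the kind of anomaly described above.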

Thirdly, THE need to consider whether they should assign citations to so many fields since this increases the distortions that can arise when there is a highly cited paper in a normally lowly cited field.

Fourthly, should they assign a thirty per cent weighting to an indicator that may be useful for distinguishing between the likes of MIT and Caltech but may be of little relevance for the universities that are now signing up for the world rankings?

Sunday, August 16, 2015

Research Ranking of African Universities

I suppose it should no longer be surprising that university heads flock to Johannesburg for the unveiling of an "experimental" research ranking of 30 African universities that put the University of Port Harcourt in 6th place, did not include Cairo University, the University of Ibadan, Ahmadu Bello University or the University of Nigeria Nsukka, placed Makerere above Stellenbosch and Universite Cadi Ayyad above the University of Pretoria.

It is still a bit odd that African universities seem to have ignored a reasonable and sensible research ranking from the Journals Consortium that I found while reading an article by Gerald Ouma in the Mail and Guardian Africa (which, by the way, carried an advertisement about Université Cadi Ayyad being number ten in the THE African ranking).

The Journals Consortium ranking is based on publications, citations and web visibility and altogether 1,447 institutions are ranked. The methodology, which is a bit thin, is here.

Here are the top ten.
1.   University of Cape Town
2.   Cairo University
3.   University of Pretoria
4.   University of Nairobi
5.   University of South Africa
6.   University of the Witwatersrand
7.   Stellenbosch University
8.   University of Ibadan
9.   University of Kwazulu-Natal
10. Ain Shams University

The University of Port Harcourt is 36th and Universite Cadi Ayyad is 89th.

I am pleased to note that two of my former employers are in the rankings, University of Maiduguri in 66th place and Umar ibn Ibrahim El-Kanemi College of Education, Science and Technology (formerly Borno College of Basic Studies) in 988th.



Friday, August 14, 2015

This is also really frightening

From The Times, which is supposed to be a really posh paper -- I remember adverts "Top People Read the Times" -- read by people with degrees from Russell Group universities:

"Of the 3 million Muslims in Britain, about 2.3 million identify as Sunni, compared with 300,000 Shias, or 5 per cent of the total. Most British Shias have roots in Iran, Iraq, Azer­baijan or ­Bahrain. Sunnis make up the vast majority of Muslims worldwide."

Thursday, August 13, 2015

This is really frightening

The evidence that human intelligence is falling continues to accumulate. PISA scores in Sweden are down and not just among immigrants. The intelligence of US marines, as measured by the General Classification Test, has been in decline since the 1980s. Based on "a small, probably representative sample" the French national IQ has dropped since 1999.

And now we have this from an article about the possible revival of the Liberal Democrats by Jon Stone, who is a reporter, in the Independent, which is a newspaper.

"Lazarus is a character in the Christian holy book The Bible who comes back from the dead after an intervention by Jesus Christ, a religious figure."

I thought the Independent was one of the posh papers read by bright people who had degrees and knew how ignorant and illiterate UKIP supporters were.

Does Jon Stone really think he has to explain to his readers what the Bible is? Or is this some sort of PC policy?

The worst thing is that he apparently doesn't know that Lazarus was really a character in a Robert Heinlein novel.









Wednesday, August 12, 2015

The Plague of Authorship Inflation

An article in the Wall Street Journal by Robert Lee Hotz describes the apparently inexorable increase in the number of authors of scientific papers.

In 2014, according to the Web of Science, the number of papers with 50 or more authors reached over 1,400 and the number with 500 or more was over 200. The situation is getting so bad that one journal, Nature, was unable to list all the authors of a paper in its print edition.

Hotz has an amusing digression in which he recounts how scientists have listed a hamster, a dog and a computer as co-authors.

One issue that he does not explore is the way in which multi-authorship has distorted global university rankings. Times Higher Education and Thomson Reuters until this year declined to use fractional counting of citations in their World University Rankings, so that every one of hundreds of contributors was credited with every one of thousands of citations. When this was combined with normalisation across 250 fields, so that a few citations could have a disproportionate effect, and a deceptive regional modification that rewarded universities for being in a country that produced few citations, the results could be ludicrous. Unproductive institutions, for example Alexandria University, those that are very small, for example Scuola Normale Superiore Pisa, or very specialised, for example Moscow State Engineering Physics Institute, have been presented by THE as world leaders for research impact.

Let us hope that this indicator is reformed in the forthcoming world rankings.



Sunday, August 09, 2015

Another Ranking Indicator for Africa


The prestigious and exclusive THE African summit is over. Whether it will lead to a serious regional ranking remains to be seen. The indicators used by THE in their world rankings and various regional spin-offs seem generally inappropriate to all but about two dozen institutions: reputation for research, income in three different indicators, citations, number of doctoral students.

But there is still a need to compare and evaluate the effectiveness of African universities in providing instruction in academic, technical and professional subjects and perhaps in their participation in innovative and economically beneficial projects.

Probably the way ahead for African ranking is the use of social media, bypassing the very problematical collection of institutional data.  More of that later.

Anyway, here is a ranking of African universities according to the number of results from a search of the WIPO Patentscope site. Searching was done on the 5th and 6th of August. Universities included the top 50 African universities in Webometrics and any university in the recent THE pilot ranking. All fields were searched.

There are no real surprises. South Africa is dominant, followed by Egypt. The flagships of Uganda, Kenya, Ghana and Nigeria are represented. Most universities in Africa produce no innovative research that is reflected in patents.




Rank, university, country and references in patents (any field):

1.    University of Cape Town, South Africa (377)
2.    University of Pretoria, South Africa (242)
3.    University of the Witwatersrand, South Africa (217)
4.    Stellenbosch University, South Africa (165)
5.    North West University, South Africa (125)
6.    Cairo University, Egypt (100)
7.    University of the Free State, South Africa (72)
8.    University of Johannesburg, South Africa (46)
9.    University of Kwazulu-Natal, South Africa (41)
10.   Nelson Mandela Metropolitan University, South Africa (34)
11.   Assiut University, Egypt (31)
12.   Rhodes University, South Africa (30)
13.   University of Nairobi, Kenya (21)
14.   Makerere University, Uganda (20)
15.   University of the Western Cape, South Africa (18)
16.   American University in Cairo, Egypt (17)
17.   University of Ghana, Ghana (13)
18.   Université Mohammed V Souissi, Morocco (12)
19.   Cape Peninsula University of Technology, South Africa (11)
20.   Mansoura University, Egypt (10)
21.   University of Namibia, Namibia (9)
22.   Alexandria University, Egypt (8)
23.   University of Ibadan, Nigeria (7)
24=   Kenyatta University, Kenya (6)
24=   University of Zimbabwe, Zimbabwe (6)
24=   Durban University of Technology, South Africa (6)
27=   University of South Africa, South Africa (5)
27=   Zagazig University, Egypt (5)
27=   Suez Canal University, Egypt (5)
30=   University of Dar es Salaam, Tanzania (4)
30=   Addis Ababa University, Ethiopia (4)
32=   University of Ilorin, Nigeria (3)
32=   University of Khartoum, Sudan (3)
32=   University of Malawi, Malawi (3)
35=   Helwan University, Egypt (2)
35=   Université Hassan II Ain Chock, Morocco (2)
35=   Université Cadi Ayyad, Morocco (2)
35=   Kafrelsheikh University, Egypt (2)
35=   University of Zambia, Zambia (2)
35=   Ahmadu Bello University, Nigeria (2)
41=   University of Lagos, Nigeria (1)
41=   Université Cheikh Anta Diop, Senegal (1)
41=   University of Mauritius, Mauritius (1)
41=   Université de Constantine 1, Algeria (1)
41=   Université de Yaoundé 1, Cameroon (1)
46=   Obafemi Awolowo University, Nigeria (0)
46=   Kwame Nkrumah University of Science and Technology, Ghana (0)
46=   University of Port Harcourt, Nigeria (0)
46=   University of Botswana, Botswana (0)
46=   Tanta University, Egypt (0)
46=   Covenant University, Nigeria (0)
46=   Université de Béjaïa, Algeria (0)
46=   Minia University, Egypt (0)
46=   University of Tunis, Tunisia (0)
46=   Benha University, Egypt (0)
46=   Universidade Católica de Angola, Angola (0)
46=   Université de Lomé, Togo (0)
46=   South Valley University, Egypt (0)
46=   Université Abou Bekr Belkaid, Algeria (0)
46=   Beni-Suef University, Egypt (0)
46=   Université Omar Bongo, Gabon (0)
46=   University of The Gambia, Gambia (0)
46=   Université de Toliara, Madagascar (0)
46=   Université Kasdi Merbah Ouargla, Algeria (0)
46=   Université de la Réunion, Réunion (0)
46=   Universidade Eduardo Mondlane, Mozambique (0)
46=   Université de Ouagadougou, Burkina Faso (0)
46=   University of Rwanda, Rwanda (0)
46=   Université de Bamako, Mali (0)
46=   University of Swaziland, Swaziland (0)
46=   Université Félix Houphouët-Boigny, Ivory Coast (0)
46=   Université de Kinshasa, Democratic Republic of the Congo (0)
46=   National University of Lesotho, Lesotho (0)
46=   Universidade Jean Piaget de Cabo Verde, Cape Verde (0)
46=   National Engineering School of Sfax, Tunisia (0)
46=   Université Marien Ngouabi, Republic of the Congo (0)
46=   University of Liberia, Liberia (0)
46=   Université Djillali Liabes, Algeria (0)
46=   Université Abdou Moumouni de Niamey, Niger (0)
46=   Misurata University, Libya (0)
46=   Université de Dschang, Cameroon (0)
46=   Université de Bangui, Central African Republic (0)
46=   Université de Nouakchott, Mauritania (0)
46=   Eritrea Institute of Technology, Eritrea (0)
46=   Université de Djibouti, Djibouti (0)
46=   University of Seychelles, Seychelles (0)
46=   Mogadishu University, Somalia (0)
46=   Universidad Nacional de Guinea Ecuatorial, Equatorial Guinea (0)
46=   Université Gamal Abdel Nasser de Conakry, Guinea (0)
46=   University of Makeni, Sierra Leone (0)
46=   John Garang Memorial University, South Sudan (0)
46=   Hope African University, Burundi (0)
46=   Université de Moundou, Chad (0)

The Onion Analyses the US News Rankings

Just an extract. The whole thing is here.

  • Step 1: Schools are weighed on a scale
  • Step 2: Researchers calculate each campus’ student-to-student ratio
  • Step 3: Any college whose colors are maroon and gold is immediately eliminated

Friday, August 07, 2015

Error announcement from CWTS Leiden Ranking


See here for an error announcement from CWTS Leiden Ranking.

The prompt disclosure of the error adds to the credibility of the rankings.

Wednesday, August 05, 2015

Japan Targets the THE Rankings



The Wall Street Journal has an article about the proposed transformation of Japanese higher education. The national government is apparently using financial pressure to persuade universities to either become world class or local industry-orientated institutions.

World class means being in the top 100 of the THE world university rankings. Japan wants to have ten there. Now it only has two.

It is not a good idea to focus on just one ranking. If the Japanese government insists on aiming at just one then  THE might not be the best bet. The Shanghai rankings are more stable, reliable and transparent. In addition, it seems that the THE rankings are biased against Japan.

Tuesday, August 04, 2015

Degree Class is no Longer Important

The relentless grade inflation in British secondary and tertiary education has been well documented. A first or upper second class degree or a grade A at A level no longer means very much. It has  been far too easy for universities to cover up their deficiencies or attract applications by handing out firsts or upper seconds like smarties.

Now the consequences are beginning to become apparent. Ernst and Young (EY), the accounting firm, will no longer require applicants to have an upper second or three grade Bs. Instead they will use "numerical tests" and "strength assessments" to assess applicants.

I suspect that in a little while EY will come under fire for not recruiting enough of those groups that do not test well, especially in quantitative skills. They and others will probably join in the hunt for the Holy Grail of modern social science, the factor that is non-cognitive but still significantly predictive of career success.

Meanwhile, universities will try to find new functions to replace their historical role as the guarantor of a certain level of cognitive ability. Expect to see more conversations about assessing civic engagement or reaching out to communities.

Monday, August 03, 2015

The CWUR Rankings 2015

The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced the latest edition of its global ranking of 1,000 universities.  The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.

The rankings include five indicators that measure various aspects of publication and research: publications in "reputable journals", research papers in "highly influential" journals, citations, h-index and patents.

These indicators are given a combined weighting of 25%.

Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). This is obviously a crude measure which fails to distinguish among the great mass of universities that have never won an award.

Similarly, another quarter is assigned to Quality of Faculty, measured by the number of faculty receiving such awards, and the final quarter to Alumni Employment, measured by the number of alumni holding CEO positions in top corporations. Again, these indicators are of little or no relevance to all but a few hundred institutions.

The Top Ten are:

1.    Harvard
2.    Stanford
3.    MIT
4.    Cambridge
5.    Oxford
6.    Columbia
7.    Berkeley
8.    Chicago
9.    Princeton
10.  Cornell.

The only change from last year is that Cornell has replaced Yale in tenth place.


Countries with Universities in the Top Hundred in 2015 and 2014


Country          Top 100, 2015   Top 100, 2014
US                    55              53
UK                     7               7
Japan                  7               8
Switzerland            4               4
France                 4               4
Canada                 3               3
Israel                 3               3
South Korea            2               1
Germany                2               4
Australia              2               2
China                  2               2
Netherlands            2               1
Russia                 1               1
Taiwan                 1               1
Belgium                1               1
Norway                 1               0
Sweden                 1               2
Singapore              1               1
Denmark                1               1
Italy                  0               1


Top Ranked in Region or Country

USA:                                       Harvard
Canada:                                  Toronto
Asia:                                       Tokyo
South Asia:                             IIT Delhi
Southeast Asia:                     National University of Singapore
Europe:                                   Cambridge
Central and Eastern Europe:    Lomonosov Moscow State University
Arab World:                             King Saud University
Middle East:                             Hebrew University of Jerusalem
Latin America:                         Sao Paulo
Africa:                                      University of the Witwatersrand
Caribbean:                           University of Puerto Rico at Mayagüez



Noise Index


In the top 20, the CWUR rankings are more stable than THE and QS but less stable than the Shanghai rankings.

Average position change of universities in the top 20 in 2014: 0.5

Comparison

CWUR 2013-14:             0.9
Shanghai Rankings (ARWU)
2011-12:                  0.15
2012-13:                  0.25
THE WUR 2012-13:          1.2
QS WUR 2012-13:           1.7

With regard to the top 100, the CWUR rankings are more stable this year, with volatility similar to that of the QS and THE rankings, although still considerably greater than that of ARWU.



Average position change of universities in the top 100 in 2014: 4.15

Comparison

CWUR 2013-14:             10.59

Shanghai Rankings (ARWU)

2011-12:                   2.01
2012-13:                   1.66
THE WUR 2012-13:           5.36
QS WUR 2012-13:            3.97
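The noise index used above can be sketched in a few lines of code: take the universities in a given year's top N, look up where each stood the previous year, and average the absolute rank changes. This is a minimal illustration, not the blog's exact procedure; in particular, the treatment of universities that enter or leave the top N (here they are simply skipped) is my assumption, and the toy data below are hypothetical.

```python
# Sketch of a "noise index": mean absolute change in rank position
# for universities in the current top N, between two editions.
# Rankings are assumed to be simple dicts mapping university -> rank.

def average_position_change(prev, curr, top_n=20):
    """Mean absolute rank change for universities in curr's top N
    that also appeared in prev. Entrants/leavers are skipped,
    which is an assumption, not the blog's stated method."""
    changes = [abs(prev[uni] - rank)
               for uni, rank in curr.items()
               if rank <= top_n and uni in prev]
    return sum(changes) / len(changes)

# Toy illustration (hypothetical ranks, not real CWUR data):
prev = {"Harvard": 1, "Stanford": 2, "Cornell": 11, "Yale": 10}
curr = {"Harvard": 1, "Stanford": 2, "Cornell": 10, "Yale": 12}
print(average_position_change(prev, curr, top_n=10))  # (0 + 0 + 1) / 3
```

A lower value means a more stable ranking; by this measure a methodologically steady exercise such as ARWU scores well below the more volatile THE and QS rankings.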


Saturday, August 01, 2015

The Other THE African University Rankings


THE have just presented Africa and the world with a list of 30 African universities ranked according to "research impact", that is the number of citations per paper normalised by field (300 of them?) and year. Citations are not just counted but compared with the world average for specific years and fields.

The result is that a university that manages to join a large international project, typically in medicine, genetics or particle physics with a disproportionate number of citations especially in the first couple of years of publication, can get an extremely high score. If the university had few publications to begin with the score for this indicator would be even higher.

For these rankings THE have introduced fractionalised counting, so that a university that is one of 100 contributors to a project with 2,000 citations would get the equivalent of 20 citations. Under the procedure that THE and their former data collectors Thomson Reuters had been using for the world university rankings, it would have been credited with all 2,000 citations, as would every other contributor.
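The difference between the two counting methods can be sketched as follows, using the hypothetical figures from the paragraph above (the function names are illustrative, not THE's actual procedure):

```python
# Whole vs. fractionalised citation counting, illustrated with the
# hypothetical example from the text: a paper with 100 contributing
# institutions and 2,000 citations.

def whole_counting(citations, n_contributors):
    """Old THE/Thomson Reuters approach: every contributing
    institution is credited with all of the paper's citations."""
    return citations

def fractional_counting(citations, n_contributors):
    """Fractionalised counting: the citations are divided
    equally among the contributing institutions."""
    return citations / n_contributors

print(whole_counting(2000, 100))       # 2000 citations credited
print(fractional_counting(2000, 100))  # 20.0 citations credited
```

Under whole counting the credit handed out across all contributors sums to 100 times the paper's actual citations, which is why a single mega-collaboration paper could dominate the citations indicator; fractional counting caps the total credit at the paper's real citation count.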

THE are to be congratulated for finally using fractionalised counting, which has reduced the likelihood of the indicator producing very odd results. Even so, the snapshot ranking is inappropriate for African universities, as it still privileges those that happen to contribute to a few international projects.

The results might seem acceptable to THE and its international audience, but I suspect that Egyptian academics will be amused by a ranking that includes six universities but not Cairo University. I wonder how many Nigerians will accept a ranking that includes Port Harcourt but not Ibadan or Ahmadu Bello.

Along with standardised scores for citations, THE has also included the number of publications in the Scopus database between 2009 and 2013. This measure of research output of a fairly high quality is probably more relevant to Africa than the citations indicator. Unfortunately, it shows how little research is done between the Sahara and the Kalahari, and so it would have been inexpedient to present it as a snapshot of what a future ranking might look like.

The methods, approaches and assumptions of THE's world rankings, with their emphasis on inputs (especially income), research quality (inappropriately called research impact or research influence), reputation, and doctoral education, are of limited value to all but a few African universities and stakeholders. Whether anything of value comes from the conversation in Johannesburg remains to be seen, but it is unlikely that a modified version of the world rankings will be of much value to anyone.

Anyway, below are the 30 African universities reordered according to number of publications.

Rank  University                               Country        Publications 2009-2013
1     University of Cape Town                  South Africa   5540.21
2     University of Pretoria                   South Africa   4544.33
3     University of the Witwatersrand          South Africa   4387.17
4     Stellenbosch University                  South Africa   4357.33
5     University of Kwazulu-Natal              South Africa   4235.09
6     Alexandria University                    Egypt          2550.15
7     Universite de Sfax                       Tunisia        2355.30
8     University of Johannesburg               South Africa   2192.74
9     North West University                    South Africa   1707.94
10    Assiut University                        Egypt          1588.64
11    University of the Free State             South Africa   1512.56
12    Université Mohammed V – Agdal            Morocco        1503.69
13    Rhodes University                        South Africa   1296.96
14    University of the Western Cape           South Africa   1154.77
15    Makerere University                      Uganda         1112.69
16    Suez Canal University                    Egypt          998.98
17    University of South Africa               South Africa   981.67
18    Nelson Mandela Metropolitan University   South Africa   885.77
19    Universite Hassan II Casablanca          Morocco        960.25
20    Universite Cadi Ayyad                    Morocco        910.82
21    Addis Ababa University                   Ethiopia       893.90
22    Universite de Tunis                      Tunisia        879.63
23    Universite de Yaounde I                  Cameroon       876.33
24    Ecole Nationale d'Ingénieurs de Sfax     Tunisia        822.31
25    University of Ghana                      Ghana          804.53
26    American University in Cairo             Egypt          700.89
27    Minia University                         Egypt          694.79
28    University of Nairobi                    Kenya          671.72
29    South Valley University                  Egypt          636.85
30    University of Port Harcourt              Nigeria        573.55