Sunday, September 06, 2015

More on Alternative Indicators for Ranking African Universities


Continuing with our exploration of how to rank universities outside the world's top 200 or 400, where it is necessary to develop robust and sophisticated techniques of standardisation, normalisation, scaling, regional modification, taking away the number you first thought of (just kidding), verification, weighting and validation in order to figure out that Caltech's normalised research impact is slightly better than Harvard's or that Cambridge is a bit more international than that place in the other Cambridge, here is a ranking of African universities according to recommendations on LinkedIn.

There are obvious problems with this indicator, not least of which is the tiny number of responses compared to all the students on the continent. It might, however, be the precursor to a useful survey of student opinion or graduate employability later on.

First place goes to the University of South Africa, an open distance education institution whose alumni include Nelson Mandela, Cyril Ramaphosa and Jean-Bertrand Aristide. Makerere University, the University of Nairobi and Kenyatta University do well.

Data was compiled on the 28th and 29th of July. The sample comprised all universities included in the THE experimental African ranking, the top fifty African universities in Webometrics, plus the top university in Webometrics or 4icu of any country not otherwise included.


Rank | University | Country | Number of LinkedIn recommendations
1 | University of South Africa | South Africa | 154
2 | Makerere University | Uganda | 116
3 | University of the Witwatersrand | South Africa | 94
4 | University of Ibadan | Nigeria | 86
5 | University of Johannesburg | South Africa | 79
6 | University of Nairobi | Kenya | 75
7 | Cairo University | Egypt | 67
8 | Stellenbosch University | South Africa | 63
9 | University of Pretoria | South Africa | 62
10 | Kenyatta University | Kenya | 61
11 | University of Cape Town | South Africa | 60
12 | University of Lagos | Nigeria | 58
13 | Addis Ababa University | Ethiopia | 55
14 | Obafemi Awolowo University | Nigeria | 50
15 | Alexandria University | Egypt | 47
16 | Rhodes University | South Africa | 42
17 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 40
18 | American University in Cairo | Egypt | 28
19 | University of Kwazulu-Natal | South Africa | 26
20 | University of Ilorin | Nigeria | 24
21 | University of Zimbabwe | Zimbabwe | 22
22 | Kwame Nkrumah University of Science and Technology | Ghana | 21
23 | Helwan University | Egypt | 20
24= | North West University | South Africa | 18
24= | University of Ghana | Ghana | 18
24= | University of Port Harcourt | Nigeria | 18
27= | Durban University of Technology | South Africa | 16
27= | University of Dar Es Salaam | Tanzania | 16
29= | Nelson Mandela Metropolitan University | South Africa | 14
29= | University of the Western Cape | South Africa | 14
31 | Cape Peninsula University of Technology | South Africa | 13
32 | Mansoura University | Egypt | 12
33 | University of Botswana | Botswana | 10
34 | Covenant University | Nigeria | 9
35= | Zagazig University | Egypt | 7
35= | Suez Canal University | Egypt | 7
37 | Tanta University | Egypt | 6
38= | Assiut University | Egypt | 5
38= | Université Constantine 1 | Algeria | 5
40= | University of the Free State | South Africa | 4
40= | Universite des Sciences et de la Technologie Houari Boumediene | Algeria | 4
42+ | South Valley University | Egypt | 3
42+ | Université Cadi Ayyad | Morocco | 2
42+ | University of Tunis | Tunisia | 2
42+ | University of Namibia | Namibia | 1
42+ | University of Mauritius | Mauritius | 1
42+ | Université Cheikh Anta Diop | Senegal | 0
42+ | Université Mohammed V Souissi | Morocco | 0
42+ | University of Khartoum | Sudan | 0
42+ | University of Malawi | Malawi | 0
42+ | Université Hassan II Ain Chock | Morocco | 0
42+ | Kafrelsheikh University | Egypt | 0
42+ | University of Zambia | Zambia | 0
42+ | Bejaia University | Algeria | 0
42+ | Minia University | Egypt | 0
42+ | Benha University | Egypt | 0
42+ | Universidade Católica de Angola | Angola | 0
42+ | Université de Lomé | Togo | 0
42+ | Université Abou Bekr Belkaid | Algeria | 0
42+ | Beni-Suef University | Egypt | 0
42+ | Université Omar Bongo | Gabon | 0
42+ | University of the Gambia | Gambia | 0
42+ | Université de Toliara | Madagascar | 0
42+ | Université Kasdi Merbah Ouargla | Algeria | 0
42+ | Universite de la Reunion | Reunion | 0
42+ | Université d'Abomey-Calavi | Benin | 0
42+ | Universidade Eduardo Mondlane | Mozambique | 0
42+ | Université de Ouagadougou | Burkina Faso | 0
42+ | University of Rwanda | Rwanda | 0
42+ | Universite de Bamako | Mali | 0
42+ | University of Swaziland | Swaziland | 0
42+ | Université Félix Houphouët-Boigny | Ivory Coast | 0
42+ | Université de Kinshasa | Democratic Republic of the Congo | 0
42+ | National University of Lesotho | Lesotho | 0
42+ | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0
42+ | National Engineering School of Sfax | Tunisia | 0
42+ | Université Marien Ngouabi | Republic of the Congo | 0
42+ | University of Liberia | Liberia | 0
42+ | Université Djillali Liabes | Algeria | 0
42+ | Université Abdou Moumouni de Niamey | Niger | 0
42+ | Misurata University | Libya | 0
42+ | Université de Dschang | Cameroons | 0
42+ | Université de Bangui | Central African Republic | 0
42+ | Université de Nouakchott | Mauritania | 0
42+ | Eritrea Institute of Technology | Eritrea | 0
42+ | Université de Djibouti | Djibouti | 0
42+ | University of Seychelles | Seychelles | 0
42+ | Mogadishu University | Somalia | 0
42+ | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0
42+ | Universite Gamal Abdel Nasser de Conakry | Guinea | 0
42+ | University of Makeni | Sierra Leone | 0
42+ | John Garang Memorial University | South Sudan | 0
42+ | Hope Africa University | Burundi | 0
42+ | Universite de Moundou | Chad | 0
42+ | Universite de Yaounde I | Cameroons | 0




Tuesday, September 01, 2015

Best German and Austrian Universities if you Want to get Rich

If you want to go to a university in Germany or Austria and get rich afterwards, the website Wealth-X has a ranking for you. It counts the number of UHNW (ultra high net worth) alumni, those with US$ 30 million or above.

Here are the top five with the number of UHNW individuals in brackets.

1. University of Cologne     (18)
2. University of Munich      (14)
3. University of Hamburg    (13)
4. University of Freiburg     (11)
5. University of Bonn          (11)

There may well be protests about who should be first. In tenth place is "Ludwig Maximilians University Munich (LMU Munich)", which I assume is another name for the University of Munich, with six UHNW alumni.

Monday, August 31, 2015

Update on changes in ranking methodology

Times Higher Education (THE) have been preparing the ground for methodological changes in their world rankings. A recent article by Phil Baty announced that the new world rankings scheduled for September 30 will not count the citations to 649 papers, mainly in particle physics, with more than 1000 authors.

This is perhaps the best that is technically and/or commercially feasible at this moment but it is far from satisfactory. Some of these publications are dealing with the most basic questions about the nature of physical reality and it is a serious distortion not to include them in the ranking methodology. There have been complaints about this. Pavel Krokovny's comment was noted in a previous post while Mete Yeyisoglu argues that:
"Fractional counting is the ultimate solution. I wish you could have worked it out to use fractional counting for the 2015-16 rankings.
The current interim approach you came up with is objectionable.
Why 1,000 authors? How was the limit set? What about 999 authored-articles?
Although the institution I work for will probably benefit from this interim approach, I think you should have kept the same old methodology until you come up with an ultimate solution.
This year's interim fluctuation will adversely affect the image of university rankings."

Baty provides a reasonable answer to the question why the cut-off point is 1,000 authors.

But there is a fundamental issue developing here that goes beyond ranking procedure. The concept of authorship of a philosophy paper written entirely by a single person or a sociological study from a small research team is very different from that of the huge multi-national, capital- and labour-intensive publications in which the number of collaborating institutions exceeds the number of paragraphs and there are more authors than sentences.

Fractional counting does seem to be the only fair and sensible way forward and it is now apparently on THE's agenda although they have still not committed themselves.

The objection could be raised that while the current THE system gives a huge reward to even the least significant contributing institution, fractional counting would give major research universities insufficient credit for their role in important research projects.

A long-term solution might be to draw a distinction between the contributors to and the authors of the mega papers. For most publications there would be no need to draw such a distinction but for those with some sort of input from dozens, hundreds or thousands of people it might be feasible to allot half the credit to all those who had anything to do with the project and the other half to those who meet the standard criteria of authorship. There would no doubt be a lot of politicking about who gets the credit but that would be nothing new.
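
To make the arithmetic of such a split concrete, here is a minimal sketch in Python, on the assumption (mine, not THE's) that a paper's metadata could distinguish the full pool of contributors from those meeting standard authorship criteria, and that the half-and-half division proposed above is used; the names and numbers are purely illustrative.

```python
def credit_shares(contributors, authors, total_credit=1.0):
    """Illustrative split: half the credit shared equally among all
    contributors, half shared equally among those who meet standard
    authorship criteria (assumed to be contributors as well)."""
    shares = {name: 0.5 * total_credit / len(contributors) for name in contributors}
    for name in authors:
        shares[name] += 0.5 * total_credit / len(authors)
    return shares

# Hypothetical mega-paper: 1,000 contributors, 20 of whom wrote the paper.
contributors = [f"contributor_{i}" for i in range(1000)]
authors = contributors[:20]
shares = credit_shares(contributors, authors)
print(round(shares["contributor_0"], 4))    # also an author: 0.5/1000 + 0.5/20 = 0.0255
print(round(shares["contributor_999"], 4))  # contributor only: 0.5/1000 = 0.0005
```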

Duncan Ross, the new Data and Analytics Director at THE, seems to be thinking along these lines.
"In the longer term there are one technical and one structural approach that would be viable.  The technical approach is to use a fractional counting approach (2932 authors? Well you each get 0.034% of the credit).  The structural approach is more of a long term solution: to persuade the academic community to adopt metadata that adequately explains the relationship of individuals to the paper that they are ‘authoring’.  Unfortunately I’m not holding my breath on that one."
The counting of citations to mega papers is not the only problem with the THE citations indicator. Another is the practice of giving a boost to universities in underperforming countries. A further item by Phil Baty quotes this justification from Thomson Reuters, THE's former data partner.

“The concept of the regional modification is to overcome the differences between publication and citation behaviour between different countries and regions. For example some regions will have English as their primary language and all the publications will be in English, this will give them an advantage over a region that publishes some of its papers in other languages (because non-English publications will have a limited audience of readers and therefore a limited ability to be cited). There are also factors to consider such as the size of the research network in that region, the ability of its researchers and academics to network at conferences and the local research, evaluation and funding policies that may influence publishing practice.”

THE now appear to agree that this is indefensible in the long run and hope that a more inclusive academic survey and the shift to Scopus, with broader coverage than the Web of Science, will lead to this adjustment being phased out.

It is a bit odd that TR and THE should have introduced income, in three separate indicators, and international outlook, in another three, as markers of excellence, but then included a regional modification to compensate for limited funding and international contacts.

THE are to be congratulated for having put fractional counting and phasing out the regional modification on their agenda. Let's hope it doesn't take too long.

While we are on the topic, there are some more things about the citation indicator to think about. First, to repeat a couple of points mentioned in the earlier post.

  • Reducing the number of fields or doing away with normalisation by year of citation. The more boxes into which any given citation can be dropped, the greater the chance of statistical anomalies when a cluster of citations meets a low world average of citations for that particular year of citation, year of publication and field (300 in Scopus?)

  • Reducing the weighting for this indicator. Perhaps citations per paper normalised by field is a useful instrument for comparing the quality of research of MIT, Caltech, Harvard and the like but it might be of little value when comparing the research performance of Panjab University and IIT Bombay or Istanbul University and Bogazici.

Some other things THE could think about.

  • Adding a measure of overall research impact, perhaps simply by counting citations. At the very least stop calling field- and year-normalised regionally modified citations per paper a measure of research impact. Call it research quality or something like that.

  • Doing something about secondary affiliations. So far this seems to have been a problem mainly for the Highly Cited Researchers indicator in the Shanghai ARWU but it may not be very long before more universities realise that a few million dollars for adjunct faculty could have a disproportionate impact on publication and citation counts.

  • Also, perhaps THE should consider excluding self-citations (or even citations within the same institution although that would obviously be technically difficult). Self-citation caused a problem in 2010 when Dr El Naschie's diligent citation of himself and a few friends lifted Alexandria University to fourth place in the world for research impact. Something similar might happen again now that THE are using a larger and less selective database.
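
As a rough illustration of what excluding self-citations might involve, here is a small sketch that treats a citation as a self-citation when the citing and cited papers share any author; the data structures are hypothetical and a real bibliometric database would need far more careful author disambiguation.

```python
def is_self_citation(citing_paper, cited_paper):
    """Treat a citation as a self-citation if the two papers share any
    author. Swapping 'authors' for 'affiliations' would extend the test
    to within-institution citations, which is much harder in practice."""
    return bool(set(citing_paper["authors"]) & set(cited_paper["authors"]))

def external_citation_count(cited_paper, citing_papers):
    """Count only citations coming from papers with no shared author."""
    return sum(1 for p in citing_papers if not is_self_citation(p, cited_paper))

# Hypothetical example: one of three citing papers shares an author.
paper = {"authors": {"Author A", "Author B"}}
citing = [
    {"authors": {"Author A", "Author C"}},  # self-citation
    {"authors": {"Author D"}},
    {"authors": {"Author E", "Author F"}},
]
print(external_citation_count(paper, citing))  # 2
```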


Friday, August 28, 2015

The Richest University in China ...


   ... is Tsinghua University, but Zhejiang, Peking and Shanghai Jiao Tong Universities appear to be more productive, as measured by the Publications indicator in the Shanghai rankings.

China Daily has just published a list of the top ten universities in China ranked according to annual income as reported to the Ministry of Education. Here they are with the Publications score (papers in the Science Citation Index and the Social Science Citation Index in 2014) in brackets.


1.     Tsinghua University 17.56 billion yuan (63.8)
2.     Zhejiang University 15.64 billion yuan  (68.5)
3.     Peking University 12.85 billion yuan      (64)
4.     Shanghai Jiao Tong University 11.89 billion yuan   (68.5)
5.     Fudan University 7.71 billion yuan (56.1)
6.     Wuhan University 6.83 billion yuan (45.8)
7.     Jilin University 6.82 billion yuan  (50.7)
8.     Huazhong University of Science and Technology 6.81 billion yuan  (53.1)
9.     Sun Yat-sen University 6.69 billion yuan (54.9)
10.   Sichuan University 6.58 billion yuan    (54.2).

Tuesday, August 25, 2015

Not fair to call papers freaky

A comment by Pavel Krokovny of Heidelberg University about THE's proposal to exclude papers with 1,000+ authors from their citations indicator in the World University Rankings.

"It is true that all 3k+ authors do not draft the paper together, on the contrary, only a small part of them are involved in this very final step of a giant research work leading to a sound result. It is as well true that making the research performed public and disseminating the knowledge obtained is a crucial step of the whole project. 
But what you probably missed is that this key stage would not be possible at all without a unique setup which was built and operated by profoundly more physicists and engineers than those who processed raw data and wrote a paper. Without that "hidden part of the iceberg" there would be no results at all. And it would be completely wrong to assume that the authors who did the data analysis and wrote the paper should be given the highest credit in the paper. It is very specific for the experimental HEP field that has gone far beyond the situation that was common still in the first half of 20th century when one scientist or a small group of them might produce some interesting results. The "insignificant" right tail in your distribution of papers on number of coauthors contains the hot part of the modern physics with high impact results topped by the discovery of Higgs-boson. And in your next rankings you are going to dishonour those universities that contributed to this discovery."

and


"the point is that frequent fluctuations of the ranking methodology might damage the credibility of the THE. Certainly, I do not imply here large and well-esteemed universities like Harvard or MIT. I believe their high rankings positions not to be affected by nearly any reasonable changes in the methodology. However, the highest attention to the rankings is attracted from numerous ordinary institutions across the world and their potential applicants and employees. In my opinion, these are the most concerned customers of the THE product. As I already pointed out above, it's very questionable whether participation in large HEP experiments (or genome studies) should be considered "unfair" for those institutions."

Sunday, August 23, 2015

Changes in Ranking Methodology

This year and next the international university rankings appear to be set for more volatility with unusually large upward and downward movement, partly as a result of changes to the methodology for counting citations in the QS and THE rankings.

ARWU

The global ranking season kicked off last week with the publication of the latest edition of the Academic Ranking of World Universities from the ShanghaiRanking Consultancy (SRC), which I hope to discuss in detail in a little while. These rankings are rather dull and boring, which is exactly what they should be. Harvard is, as always, number one for all but one of the indicators. Oxford has slipped from joint ninth to tenth place. Warwick has leaped into the top 100 by virtue of a Fields medal. At the foot of the table there are new contenders from France, Korea and Iran.

Since they began in 2003 the Shanghai rankings have been characterised by a generally stable methodology. In 2012, however, they had to deal with the recruitment of a large and unprecedented number of adjunct faculty by King Abdulaziz University. Previously SRC had simply divided the credit for the Highly Cited Researchers indicator equally between all institutions listed as affiliations. In 2012 and 2013 they wrote to all highly cited researchers with joint affiliations and thus determined the division of credit between primary and secondary affiliations. Then, in 2014 and this year they combined the old Thomson Reuters list, first issued in 2001, and the new one, issued in 2014, and excluded all secondary affiliations in the new list.

The result was that in 2014 the rankings showed an unusual degree of volatility although this year things are a lot more stable. My understanding is that Shanghai will move to counting only the new list next year, again without secondary affiliations, so there should be a lot of interesting changes then. It looks as though Stanford, Princeton, the University of Wisconsin -- Madison and Kyoto University will suffer because of the change while the University of California Santa Cruz, Rice University, the University of Exeter and the University of Wollongong will benefit.

While SRC has efficiently dealt with the issue of secondary affiliation with regard to its Highly Cited indicator, the issue has now resurfaced in the unusually high scores achieved by King Abdulaziz University for publications, largely because of its adjunct faculty. Expect more discussion over the next year or so. It would seem sensible for SRC to think about a five or ten year period rather than one year for their Publications indicator, and academic publishers, the media and rankers in general may need to give some thought to the proliferation of secondary affiliations.


QS

On July 27 Quacquarelli Symonds (QS) announced that for 18 months they had been thinking about normalising the counting of citations across five broad subject areas. They observed that a typical institution would receive about half of its citations from the life sciences and medicine, over a quarter from the natural sciences but just 1% from the arts and humanities.

In their forthcoming rankings QS will assign a 20% weighting for citations to each of the five subject areas, something that, according to Ben Sowter, Research Director at QS, they have already been doing for the academic opinion survey.
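
As I understand the announcement, the effect is to express citation performance relative to each faculty area's own benchmark and then average the five areas with equal weights, rather than letting medicine and the natural sciences dominate the raw count. A minimal sketch, with invented figures and a benchmark of my own choosing rather than QS's actual normalisation:

```python
# Illustrative only: citation performance in each of QS's five faculty areas
# is expressed relative to a benchmark for that area, then the five ratios
# are averaged with equal 20% weights instead of summing raw counts.
AREAS = ["arts & humanities", "engineering & technology",
         "life sciences & medicine", "natural sciences", "social sciences"]

def area_weighted_citation_score(citations, benchmarks):
    """citations and benchmarks map each faculty area to a citation count
    (the benchmark standing in for an average across ranked universities)."""
    ratios = [citations[a] / benchmarks[a] for a in AREAS]
    return sum(ratios) / len(ratios)  # equal 20% weight per area

# A hypothetical university that is strong in the humanities and social
# sciences but modest in medicine, which a raw count would barely register.
citations  = {"arts & humanities": 300, "engineering & technology": 900,
              "life sciences & medicine": 2000, "natural sciences": 1500,
              "social sciences": 800}
benchmarks = {"arts & humanities": 100, "engineering & technology": 1000,
              "life sciences & medicine": 5000, "natural sciences": 2500,
              "social sciences": 400}
print(round(area_weighted_citation_score(citations, benchmarks), 2))  # 1.38
```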

It would seem then that there are likely to be some big rises and big falls this September. I would guess that places strong in humanities, social sciences and engineering like LSE, New York University and Nanyang Technological University may go up and some of the large US state universities and Russian institutions may go down. That's a guess because it is difficult to tell what happens with the academic and employer surveys.

QS have also made an attempt to deal with the issue of hugely cited papers with hundreds, even thousands of "authors" -- contributors would be a better term -- mainly in physics, medicine and genetics. Their approach is to exclude all papers with more than 10 contributing institutions, that is 0.34% of all publications in the database.

This is rather disappointing. Papers with huge numbers of authors and citations obviously do have distorting effects but they have often dealt with fundamental and important issues. To exclude them altogether is to ignore a very significant body of research.

The obvious solution to the problem of multi-contributor papers is fractional counting, dividing the number of citations by the number of contributors or contributing institutions. QS claim that to do so would discourage collaboration, which does not sound very plausible.
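
The difference between full counting, QS-style exclusion and fractional counting is easy to see with a toy example; the paper data below is invented, and the threshold of ten contributing institutions is the one reported above.

```python
# Toy comparison of three ways of crediting one university's citations:
# full counting, exclusion of papers with more than ten contributing
# institutions (the QS approach described above), and fractional counting.
papers = [
    # (citations, contributing institutions) -- the university is one of them
    (5000, 200),  # a mega-collaboration paper, e.g. particle physics
    (40, 2),
    (10, 1),
]

def full_counting(papers):
    return sum(c for c, n in papers)

def with_exclusion(papers, max_institutions=10):
    return sum(c for c, n in papers if n <= max_institutions)

def fractional_counting(papers):
    return sum(c / n for c, n in papers)

print(full_counting(papers))                  # 5050: full credit for everything
print(with_exclusion(papers))                 # 50: the mega-paper vanishes entirely
print(round(fractional_counting(papers), 1))  # 55.0: 5000/200 + 40/2 + 10/1
```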

In addition, QS will likely extend the life of survey responses from three to five years. That could make the rankings more stable by smoothing out annual fluctuations in survey responses and reduce the volatility caused by the proposed changes in the counting of citations.

The shift to a moderate version of field normalisation is helpful as it will reduce the undue privilege given to medical research, without falling into the huge problems that result from using too many categories. It is unfortunate, however, that QS have not taken the plunge into fractional counting. One suspects that technical problems and financial considerations might be as significant as the altruistic desire not to discourage collaboration.

After a re-sorting in September the QS rankings are likely to become a bit more stable and credible but their most serious problem, the structure, validity and excessive weighting of the academic survey, has still not been addressed.

THE

Meanwhile, Times Higher Education (THE) has also been grappling with the issue of authorship inflation. Phil Baty has announced that this year 649 papers with over 1,000 authors will be excluded from their calculation of citations because "we consider them to be so freakish that they have the potential to distort the global scientific landscape".

But it is not the papers that do the distorting. It is the methodology. THE and their former data partners Thomson Reuters, like QS, have avoided fractional counting (except for a small experimental African ranking) and so every one of those hundreds or thousands of authors gets full credit for the hundreds or thousands of citations. This has given places like Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Universite Cadi Ayyad in Morocco and Bogazici University in Turkey remarkably high scores for Citations: Research Impact, much higher than their scores for the bundled research indicators.

THE have decided to simply exclude 649 papers, or 0.006% of the total, from their calculations for the world rankings. This is a much smaller proportion than the 0.34% excluded by QS. Again, this is a rather crude measure. Many of the "freaks" are major contributions to advanced research and deserve to be acknowledged by the rankings in some way.

THE did use fractional counting in their recent experimental ranking of African universities and Baty indicates that they are considering doing so in the future.

It would be a big step forward for THE if they introduce fractional counting of citations. But they should not stop there. There are other bugs in the citations indicator that ought to be fixed.

First, it does not at present measure what it is supposed to measure. It does not measure a university's overall research impact. At best, it is a measure of the average quality of research papers no matter how few (above a certain threshold) they are.

Second, the "regional modification", which divides the university citation impact score by the square root of the the score of the country where the university is located, is another source  of distortion. It gives a bonus to universities simply for being located in  underperforming countries. THE or TR have justified the modification by suggesting that some universities deserve compensation because they lack funding or networking opportunities. Perhaps they do, but this can still lead to serious anomalies.

Thirdly, THE need to consider whether they should assign citations to so many fields since this increases the distortions that can arise when there is a highly cited paper in a normally lowly cited field.

Fourthly, should they assign a thirty per cent weighting to an indicator that may be useful for distinguishing between the likes of MIT and Caltech but may be of little relevance for the universities that are now signing up for the world rankings?


Sunday, August 16, 2015

Research Ranking of African Universities

I suppose it should no longer be surprising that university heads flock to Johannesburg for the unveiling of an "experimental" research ranking of 30 African universities that put the University of Port Harcourt in 6th place, did not include Cairo University, the University of Ibadan, Ahmadu Bello University or the University of Nigeria Nsukka, placed Makerere above Stellenbosch and Universite Cadi Ayyad above the University of Pretoria.

It is still a bit odd that African universities seem to have ignored a reasonable and sensible research ranking from the Journals Consortium that I found while reading an article by Gerald Ouma in the Mail and Guardian Africa, which, by the way, had an advertisement about Universite Cadi Ayyad being number ten in the THE African ranking.

The Journals Consortium ranking is based on publications, citations and web visibility and altogether 1,447 institutions are ranked. The methodology, which is a bit thin, is here.

Here are the top ten.
1.   University of Cape Town
2.   Cairo University
3.   University of Pretoria
4.   University of Nairobi
5.   University of South Africa
6.   University of the Witwatersrand
7.   Stellenbosch University
8.   University of Ibadan
9.   University of Kwazulu-Natal
10. Ain Shams University

The University of Port Harcourt is 36th and Universite Cadi Ayyad is 89th.

I am pleased to note that two of my former employers are in the rankings, University of Maiduguri in 66th place and Umar ibn Ibrahim El-Kanemi College of Education, Science and Technology (formerly Borno College of Basic Studies) in 988th.



Friday, August 14, 2015

This is also really frightening

From The Times, which is supposed to be a really posh paper -- I remember adverts "Top People Read the Times" -- read by people with degrees from Russell Group universities:

"Of the 3 million Muslims in Britain, about 2.3 million identify as Sunni, compared with 300,000 Shias, or 5 per cent of the total. Most British Shias have roots in Iran, Iraq, Azer­baijan or ­Bahrain. Sunnis make up the vast majority of Muslims worldwide."

Thursday, August 13, 2015

This is really frightening

The evidence that human intelligence is falling continues to accumulate. PISA scores in Sweden are down and not just among immigrants. The intelligence of US marines, as measured by the General Classification Test, has been in decline since the 1980s. Based on "a small, probably representative sample" the French national IQ has dropped since 1999.

And now we have this from an article about the possible revival of the Liberal Democrats by Jon Stone, who is a reporter, in the Independent, which is a newspaper.

"Lazarus is a character in the Christian holy book The Bible who comes back from the dead after an intervention by Jesus Christ, a religious figure."

I thought the Independent was one of the posh papers read by bright people who had degrees and knew how ignorant and illiterate UKIP supporters were.

Does Jon Stone really think he has to explain to his readers what the Bible is? Or is this some sort of PC policy?

The worst thing is that he apparently doesn't know that Lazarus was really a character in a Robert Heinlein novel.


Wednesday, August 12, 2015

The Plague of Authorship Inflation

An article in the Wall Street Journal by Robert Lee Hotz describes the apparently inexorable increase in the number of authors of scientific papers.

In 2014, according to the Web of Science, the number of papers with 50 or more authors reached over 1,400 and the number with 500 or more was over 200. The situation is getting so bad that one journal, Nature, was unable to list all the authors of a paper in the print edition.

Hotz has an amusing digression where he recounts how scientists have listed a hamster, a dog and a computer as co-authors.

One issue that he does not explore is the way in which multi-authorship has distorted global university rankings. Times Higher Education and Thomson Reuters until this year declined to use fractional counting of citations in their World University Rankings, so that every one of hundreds of contributors was credited with every one of thousands of citations. When this was combined with normalisation by 250 fields, so that a few citations could have a disproportionate effect, and a deceptive regional modification that rewarded universities for being in a country that produced few citations, then the results could be ludicrous. Unproductive institutions, for example Alexandria University, those that are very small, for example Scuola Normale Superiore Pisa, or very specialised, for example Moscow State Engineering Physics Institute, have been presented by THE as world leaders for research impact.

Let us hope that this indicator is reformed in the forthcoming world rankings.



Sunday, August 09, 2015

Another Ranking Indicator for Africa


The prestigious and exclusive THE African summit is over. Whether it will lead to a serious regional ranking remains to be seen. The indicators used by THE in their world rankings and various regional spin-offs seem generally inappropriate to all but about two dozen institutions: reputation for research, income in three different indicators, citations, number of doctoral students.

But there is still a need to compare and evaluate the effectiveness of African universities in providing instruction in academic, technical and professional subjects and perhaps in their participation in innovative and economically beneficial projects.

Probably the way ahead for African ranking is the use of social media, bypassing the very problematical collection of institutional data.  More of that later.

Anyway, here is a ranking of African universities according to the number of results from a search of the WIPO Patentscope site. Searching was done on the 5th and 6th of August. Universities included the top 50 African universities in Webometrics and any university in the recent THE pilot ranking. All fields were searched.

There are no real surprises. South Africa is dominant, followed by Egypt. The flagships of Uganda, Kenya, Ghana and Nigeria are represented. Most universities in Africa do no innovative research reflected in patents.




Rank | University | Country | References in patents (any field)
1 | University of Cape Town | South Africa | 377
2 | University of Pretoria | South Africa | 242
3 | University of the Witwatersrand | South Africa | 217
4 | Stellenbosch University | South Africa | 165
5 | North West University | South Africa | 125
6 | Cairo University | Egypt | 100
7 | University of the Free State | South Africa | 72
8 | University of Johannesburg | South Africa | 46
9 | University of Kwazulu-Natal | South Africa | 41
10 | Nelson Mandela Metropolitan University | South Africa | 34
11 | Assiut University | Egypt | 31
12 | Rhodes University | South Africa | 30
13 | University of Nairobi | Kenya | 21
14 | Makerere University | Uganda | 20
15 | University of the Western Cape | South Africa | 18
16 | American University in Cairo | Egypt | 17
17 | University of Ghana | Ghana | 13
18 | Université Mohammed V Souissi | Morocco | 12
19 | Cape Peninsula University of Technology | South Africa | 11
20 | Mansoura University | Egypt | 10
21 | University of Namibia | Namibia | 9
22 | Alexandria University | Egypt | 8
23 | University of Ibadan | Nigeria | 7
24= | Kenyatta University | Kenya | 6
24= | University of Zimbabwe | Zimbabwe | 6
24= | Durban University of Technology | South Africa | 6
27= | University of South Africa | South Africa | 5
27= | Zagazig University | Egypt | 5
27= | Suez Canal University | Egypt | 5
30= | University of Dar Es Salaam | Tanzania | 4
30= | Addis Ababa University | Ethiopia | 4
32= | University of Ilorin | Nigeria | 3
32= | University of Khartoum | Sudan | 3
32= | University of Malawi | Malawi | 3
35= | Helwan University | Egypt | 2
35= | Université Hassan II Ain Chock | Morocco | 2
35= | Université Cadi Ayyad Marrakech | Morocco | 2
35= | Kafrelsheikh University | Egypt | 2
35= | University of Zambia | Zambia | 2
35= | Ahmadu Bello University | Nigeria | 2
41= | University of Lagos | Nigeria | 1
41= | Université Cheikh Anta Diop | Senegal | 1
41= | University of Mauritius | Mauritius | 1
41= | Université de Constantine 1 | Algeria | 1
41= | Université de Yaounde 1 | Cameroons | 1
46= | Obafemi Awolowo University | Nigeria | 0
46= | Kwame Nkrumah University of Science and Technology | Ghana | 0
46= | University of Port Harcourt | Nigeria | 0
46= | University of Botswana | Botswana | 0
46= | Tanta University | Egypt | 0
46= | Covenant University | Nigeria | 0
46= | Bejaia University | Algeria | 0
46= | Minia University | Egypt | 0
46= | University of Tunis | Tunisia | 0
46= | Benha University | Egypt | 0
46= | Universidade Católica de Angola | Angola | 0
46= | Université de Lomé | Togo | 0
46= | South Valley University | Egypt | 0
46= | Université Abou Bekr Belkaid | Algeria | 0
46= | Beni-Suef University | Egypt | 0
46= | Université Omar Bongo | Gabon | 0
46= | University of The Gambia | Gambia | 0
46= | Université de Toliara | Madagascar | 0
46= | Université Kasdi Merbah Ouargla | Algeria | 0
46= | Universite de la Reunion | Reunion | 0
46= | Universidade Eduardo Mondlane | Mozambique | 0
46= | Université de Ouagadougou | Burkina Faso | 0
46= | University of Rwanda | Rwanda | 0
46= | Universite de Bamako | Mali | 0
46= | University of Swaziland | Swaziland | 0
46= | Université Félix Houphouët-Boigny | Ivory Coast | 0
46= | Université de Kinshasa | Democratic Republic of the Congo | 0
46= | National University of Lesotho | Lesotho | 0
46= | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0
46= | National Engineering School of Sfax | Tunisia | 0
46= | Université Marien Ngouabi | Republic of the Congo | 0
46= | University of Liberia | Liberia | 0
46= | Université Djillali Liabes | Algeria | 0
46= | Université Abdou Moumouni de Niamey | Niger | 0
46= | Misurata University | Libya | 0
46= | Université de Dschang | Cameroons | 0
46= | Université de Bangui | Central African Republic | 0
46= | Université de Nouakchott | Mauritania | 0
46= | Eritrea Institute of Technology | Eritrea | 0
46= | Université de Djibouti | Djibouti | 0
46= | University of Seychelles | Seychelles | 0
46= | Mogadishu University | Somalia | 0
46= | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0
46= | Universite Gamal Abdel Nasser de Conakry | Guinea | 0
46= | University of Makeni | Sierra Leone | 0
46= | John Garang Memorial University | South Sudan | 0
46= | Hope Africa University | Burundi | 0
46= | Universite de Moundou | Chad | 0

The Onion Analyses the US News Rankings

Just an extract. The whole thing is here.

  • Step 1: Schools are weighed on a scale
  • Step 2: Researchers calculate each campus’ student-to-student ratio
  • Step 3: Any college whose colors are maroon and gold is immediately eliminated

Friday, August 07, 2015

Error announcement from CWTS Leiden Ranking


See here for an error announcement from CWTS Leiden Ranking.

The prompt disclosure of the error adds to the credibility of the rankings.

Wednesday, August 05, 2015

Japan Targets the THE Rankings



The Wall Street Journal has an article about the proposed transformation of Japanese higher education. The national government is apparently using financial pressure to persuade universities to either become world class or local industry-orientated institutions.

World class means being in the top 100 of the THE world university rankings. Japan wants to have ten there. Now it only has two.

It is not a good idea to focus on just one ranking. If the Japanese government insists on aiming at just one, then THE might not be the best bet. The Shanghai rankings are more stable, reliable and transparent. In addition, it seems that the THE rankings are biased against Japan.

Tuesday, August 04, 2015

Degree Class is no Longer Important

The relentless grade inflation in British secondary and tertiary education has been well documented. A first or upper second class degree or a grade A at A level no longer means very much. It has been far too easy for universities to cover up their deficiencies or attract applications by handing out firsts or upper seconds like smarties.

Now the consequences are beginning to become apparent. Ernst and Young (EY), the accounting firm, will no longer require applicants to have an upper second or three grade Bs. Instead they will use "numerical tests" and "strength assessments" to assess applicants.

I suspect that in a little while EY will come under fire for not recruiting enough of those groups that do not test well, especially in quantitative skills. They and others will probably join in the hunt for the Holy Grail of modern social science, the factor that is non-cognitive but still significantly predictive of career success.

Meanwhile, universities will try to find new functions to replace their historical role as the guarantor of a certain level of cognitive ability. Expect to see more conversations about assessing civic engagement or reaching out to communities.

Monday, August 03, 2015

The CWUR Rankings 2015

The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced the latest edition of its global ranking of 1,000 universities. The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.

The rankings include five indicators that measure various aspects of publication and research: publications in "reputable journals", research papers in "highly influential" journals, citations, h-index and patents.

These indicators are given a combined weighting of 25%.

Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). This is obviously a crude measure which fails to distinguish among the great mass of universities that have never won an award.

Similarly, another quarter is assigned to Quality of Faculty, measured by the number of faculty receiving such awards.

The final quarter goes to Alumni Employment, measured by the number of alumni holding CEO positions in top companies. Again, these indicators are of little or no relevance to all but a few hundred institutions.

The Top Ten are:

1.    Harvard
2.    Stanford
3.    MIT
4.    Cambridge
5.    Oxford
6.    Columbia
7.    Berkeley
8.    Chicago
9.    Princeton
10.  Cornell.

The only change from last year is that Cornell has replaced Yale in tenth place.


Countries with Universities in the Top Hundred in 2015 and 2014


Country | Universities in top 100, 2015 | 2014
US | 55 | 53
UK | 7 | 7
Japan | 7 | 8
Switzerland | 4 | 4
France | 4 | 4
Canada | 3 | 3
Israel | 3 | 3
South Korea | 2 | 1
Germany | 2 | 4
Australia | 2 | 2
China | 2 | 2
Netherlands | 2 | 1
Russia | 1 | 1
Taiwan | 1 | 1
Belgium | 1 | 1
Norway | 1 | 0
Sweden | 1 | 2
Singapore | 1 | 1
Denmark | 1 | 1
Italy | 0 | 1


Top Ranked in Region or Country

USA:                                       Harvard
Canada:                                  Toronto
Asia:                                       Tokyo
South Asia:                             IIT Delhi
Southeast Asia :                      National University of Singapore
Europe:                                   Cambridge
Central and Eastern Europe:    Lomonosov Moscow State University
Arab World:                             King Saud University
Middle East:                             Hebrew University of Jerusalem
Latin America:                         Sao Paulo
Africa:                                      University of the Witwatersrand
Caribbean:                              University of Puerto Rico at Mayagüez



Noise Index


In the top 20, the CWUR rankings are more stable than THE and QS but less stable than the Shanghai rankings.

Average position change of universities in the top 20 in 2014: 0.5

Comparison

CWUR 2013-14: 0.9
Shanghai Rankings (ARWU)
2011-12: 0.15
2012-13: 0.25
THE WUR 2012-13: 1.2
QS WUR 2012-13: 1.7

With regard to the top 100, the CWUR rankings are more stable this year, with a volatility similar to that of the QS and THE rankings although still considerably greater than that of ARWU.



Average position change of universities in the top 100 in 2014: 4.15

Comparison

CWUR 2013-14: 10.59
Shanghai Rankings (ARWU)
2011-12: 2.01
2012-13: 1.66
THE WUR 2012-13: 5.36
QS WUR 2012-13: 3.97
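
For what it is worth, here is a minimal sketch of the sort of calculation behind these figures, assuming the index is simply the mean absolute change in rank for universities that appear in the relevant top group in both years; the numbers below are invented.

```python
def average_position_change(last_year, this_year, top_n):
    """Mean absolute change in rank for universities in last year's top N
    that also appear in this year's table -- a simple volatility measure."""
    changes = [abs(this_year[u] - rank)
               for u, rank in last_year.items()
               if rank <= top_n and u in this_year]
    return sum(changes) / len(changes)

# Invented three-university example: one pair swaps places.
last = {"Alpha University": 1, "Beta University": 2, "Gamma University": 3}
this = {"Alpha University": 1, "Beta University": 3, "Gamma University": 2}
print(round(average_position_change(last, this, top_n=3), 2))  # 0.67
```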


Saturday, August 01, 2015

The Other THE African University Rankings


THE have just presented Africa and the world with a list of 30 African universities ranked according to "research impact", that is the number of citations per paper normalised by field (300 of them?) and year. Citations are not just counted but compared with the world average for specific years and fields.

The result is that a university that manages to join a large international project, typically in medicine, genetics or particle physics with a disproportionate number of citations especially in the first couple of years of publication, can get an extremely high score. If the university had few publications to begin with the score for this indicator would be even higher.
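
A minimal sketch of field- and year-normalised citation impact, with invented world baselines, showing why a single heavily cited collaboration paper can dominate the score of a university with a small output:

```python
# Illustrative only: each paper's citation count is divided by an invented
# world average for its field and year, and the university's score is the
# mean of those ratios. One mega-paper in a small portfolio dominates.
world_average = {("particle physics", 2013): 8.0, ("sociology", 2013): 2.0}

def normalised_impact(papers):
    """papers: list of (field, year, citations) tuples."""
    ratios = [cites / world_average[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

small_university = [
    ("particle physics", 2013, 2000),  # one multi-author collaboration paper
    ("sociology", 2013, 1),
    ("sociology", 2013, 0),
]
print(round(normalised_impact(small_university), 1))  # 83.5, where 1.0 is the world average
```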

For these rankings THE have introduced fractionalised counting, so that a university that is one of 100 contributors to a project with 2000 citations would get the equivalent of 20 citations. Under the procedure that THE and their former data partners Thomson Reuters had been using for the world university rankings it would have been credited with 2000 citations, as would all the other contributors.

THE are to be congratulated for finally using fractionalised counting, which has reduced the likelihood of the indicator producing very odd results. Even so, the snapshot ranking is inappropriate for African universities, as it still privileges those that happen to contribute to a few international projects.

The results might seem acceptable to THE and its international audience, but I suspect that Egyptian academics will be amused by a ranking that includes six universities but not Cairo University. I wonder how many Nigerians will accept a ranking that includes Port Harcourt but not Ibadan or Ahmadu Bello.

Along with standardised scores for citations THE has also included the number of publications in the Scopus database between 2009 and 2013. This, a measurement of research output of a fairly high quality, is probably more relevant to Africa than the citations indicator. Unfortunately, it shows the very limited amount of research done between the Sahara and the Kalahari and so would be inexpedient to present as a snapshot of what a future ranking might look like.

The methods, approaches and assumptions of THE's world rankings with their emphasis on inputs, especially income, research quality, inappropriately called research impact or research influence, reputation, and doctoral education are of limited value to all but a few African universities and stakeholders. Whether anything of value comes from the conversation in Johannesburg remains to be seen but it is unlikely that a modified version of the world rankings will be of much value to anyone.

Anyway, below are the 30 African universities reordered according to number of publications.


Rank | University | Country | Number of publications 2009-2013
1 | University of Cape Town | South Africa | 5540.21
2 | University of Pretoria | South Africa | 4544.33
3 | University of the Witwatersrand | South Africa | 4387.17
4 | Stellenbosch University | South Africa | 4357.33
5 | University of Kwazulu-Natal | South Africa | 4235.09
6 | Alexandria University | Egypt | 2550.15
7 | Universite de Sfax | Tunisia | 2355.30
8 | University of Johannesburg | South Africa | 2192.74
9 | North West University | South Africa | 1707.94
10 | Assiut University | Egypt | 1588.64
11 | University of the Free State | South Africa | 1512.56
12 | Université Mohammed V – Agdal | Morocco | 1503.69
13 | Rhodes University | South Africa | 1296.96
14 | University of the Western Cape | South Africa | 1154.77
15 | Makerere University | Uganda | 1112.69
16 | Suez Canal University | Egypt | 998.98
17 | University of South Africa | South Africa | 981.67
18 | Nelson Mandela Metropolitan University | South Africa | 885.77
19 | Universite Hassan II Casablanca | Morocco | 960.25
20 | Universite Cadi Ayyad | Morocco | 910.82
21 | Addis Ababa University | Ethiopia | 893.90
22 | Universite de Tunis | Tunisia | 879.63
23 | Universite de Yaounde I | Cameroons | 876.33
24 | Ecole Nationale d’Ingénieurs de Sfax | Tunisia | 822.31
25 | University of Ghana | Ghana | 804.53
26 | American University in Cairo | Egypt | 700.89
27 | Minia University | Egypt | 694.79
28 | University of Nairobi | Kenya | 671.72
29 | South Valley University | Egypt | 636.85
30 | University of Port Harcourt | Nigeria | 573.55




Wednesday, July 29, 2015

Google Scholar Ranking of African Universities


As competition in the ranking world intensifies, Times Higher Education (THE) and Quacquarelli Symonds are diligently promoting their various regional ranking, data processing and event management projects. The latest is THE's African summit at the University of Johannesburg.

Three weeks ago THE issued what they described as an "experimental and preliminary" ranking which consisted of 15 universities ordered according to the number of citations per paper normalised for field and year. An interesting innovation was that citations were fractionalised so that participants in large collaborative projects would be credited in proportion to their fraction of the total contributors.

This is just one indicator, and it is not really a measure of research influence but rather of research quality, and it is still skewed by participation in multi-contributor papers in medicine and particle physics. It is unlikely that the University of Port Harcourt or Universite Cadi Ayyad would be in the top ten of any other indicator.

THE have indicated that they will add another 15 names to the list at the Johannesburg summit.

The table below was compiled for the purpose of checking on the claims of THE or other rankers that might attempt to evaluate African universities. It simply counts the number of results (2012-2014: excluding citations and patents) from a query to Google Scholar. Data was compiled on the 25th and 26th of July. The criteria for inclusion were being in the top 50 of the Webometrics rankings or being among the 15 universities in the THE list. The top university in any country not otherwise included was added from either the Webometrics or 4icu rankings.

This database includes papers, reports, theses and dissertations, conference proceedings and so on. It is certainly not a measure of research quality but rather of the volume of any activities connected with research. In the case of the two Kenyan universities it probably reflects the size and inclusiveness of the university repositories.

One thing about the Google Scholar list is that it confirms suspicions that the quality of Egyptian universities has been underestimated by the big name rankers. For further evidence one might look at data from social media such as LinkedIn or just contrast the aspirations of Egyptian students in the revolutions of 2011 and 2013 with those of students at the University of Cape Town and Durban University of Technology.


Rank | University | Country | Google Scholar results
1 | University of Cape Town | South Africa | 17,000
2 | Cairo University | Egypt | 16,800
3 | University of Pretoria | South Africa | 16,500
4 | University of Nairobi | Kenya | 16,400
5 | University of the Witwatersrand | South Africa | 15,800
6 | University of Kwazulu-Natal | South Africa | 15,500
7 | Stellenbosch University | South Africa | 14,900
8 | University of Ibadan | Nigeria | 14,800
9 | University of South Africa | South Africa | 13,500
10 | Kenyatta University | Kenya | 12,000
11 | University of Johannesburg | South Africa | 11,200
12 | Makerere University | Uganda | 10,400
13 | North West University | South Africa | 10,100
14 | University of Ghana | Ghana | 8,330
15 | Alexandria University | Egypt | 7,610
16 | University of Lagos | Nigeria | 7,220
17 | Rhodes University | South Africa | 7,210
18 | University of the Western Cape | South Africa | 6,870
19 | Obafemi Awolowo University | Nigeria | 6,800
20 | Mansoura University | Egypt | 6,480
21 | University of the Free State | South Africa | 6,400
22 | Addis Ababa University | Ethiopia | 6,210
23 | Zagazig University | Egypt | 6,160
24 | American University in Cairo | Egypt | 5,770
25 | University of Ilorin | Nigeria | 5,620
26 | Assiut University | Egypt | 5,580
27 | Kwame Nkrumah University of Science and Technology | Ghana | 5,080
28= | University of Zimbabwe | Zimbabwe | 4,830
28= | University of Port Harcourt | Nigeria | 4,830
30 | University of Botswana | Botswana | 4,260
31 | University of Zambia | Zambia | 4,240
32 | University of Dar Es Salaam | Tanzania | 4,120
33 | University of Khartoum | Sudan | 4,110
34 | Suez Canal University | Egypt | 3,670
35 | Tanta University | Egypt | 3,600
36 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 3,520
37 | Nelson Mandela Metropolitan University | South Africa | 3,490
38 | Covenant University Ota | Nigeria | 2,950
39 | Helwan University | Egypt | 2,940
40 | Benha University | Egypt | 2,570
41 | Minia University | Egypt | 2,390
42 | University of Malawi | Malawi | 2,340
43 | Université Abou Bekr Belkaid | Algeria | 2,290
44 | University of Tunis | Tunisia | 2,270
45= | Université Kasdi Merbah Ouargla | Algeria | 2,240
45= | Cape Peninsula University of Technology | South Africa | 2,240
47 | Université Cheikh Anta Diop de Dakar | Senegal | 1,950
48 | University of Namibia | Namibia | 1,760
49 | Universite de la Reunion | Reunion | 1,690
50 | Durban University of Technology | South Africa | 1,560
51 | University of Mauritius | Mauritius | 1,490
52 | Université d'Abomey-Calavi | Benin | 1,460
53 | South Valley University | Egypt | 1,440
54 | Universidade Eduardo Mondlane | Mozambique | 1,420
55 | Beni-Suef University | Egypt | 1,400
56 | Université Cadi Ayyad Marrakech | Morocco | 1,370
57 | Université de Ouagadougou | Burkina Faso | 1,300
58 | University of Rwanda | Rwanda | 1,270
59 | Université des Sciences et de la Technologie Houari Boumediene | Algeria | 976
60 | Université de Lomé | Togo | 784
61 | Université de Bamako | Mali | 660
62 | Kafrelsheikh University | Egypt | 618
63 | University of Swaziland | Swaziland | 615
64 | Université Félix Houphouët-Boigny | Ivory Coast | 590
65 | Université de Kinshasa | Democratic Republic of the Congo | 558
66 | National University of Lesotho | Lesotho | 555
67 | Université Constantine 1 | Algeria | 468
68 | Bejaia University | Algeria | 413
69 | Universidade Jean Piaget de Cabo Verde | Cape Verde | 407
70 | Université Mohammed V Souissi | Morocco | 361
71 | National Engineering School of Sfax | Tunisia | 271
72 | Université Marien Ngouabi | Republic of the Congo | 256
73 | University of Liberia | Liberia | 255
74 | Université Djillali Liabes | Algeria | 243
75 | Université Abdou Moumouni de Niamey | Niger | 206
76 | Misurata University | Libya | 155
77 | Université Omar Bongo | Gabon | 138
78 | University of The Gambia | Gambia | 130
79 | Universidade Católica de Angola | Angola | 115
80= | Université de Dschang | Cameroons | 113
80= | Université de Bangui | Central African Republic | 113
82 | Université de Nouakchott | Mauritania | 108
83 | Eritrea Institute of Technology | Eritrea | 76
84 | Université de Djibouti | Djibouti | 66
85 | Université de Toliara | Madagascar | 59
86 | Université Hassan II Ain Chock | Morocco | 55
87 | University of Seychelles | Seychelles | 52
88 | Mogadishu University | Somalia | 51
89 | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 40
90 | Universite Gamal Abdel Nasser de Conakry | Guinea | 21
91 | University of Makeni | Sierra Leone | 18
92 | John Garang Memorial University | South Sudan | 12
93 | Hope Africa University | Burundi | 3
94 | Universite de Moundou | Chad | 2




Tuesday, July 28, 2015

Would anyone notice if a small, old but awfully clever dog filled in a university ranking survey, and would it make a difference?

The Australian newspaper The Age has a piece by Erica Cervini on how she allowed her dog to complete the QS academic reputation survey on the quality of veterinary schools.

She doesn't elaborate on how the dog chose the schools. Was it by barking or tail wagging when shown pictures of the buildings?

Seriously though, she does have a point. Can QS stop people signing up just to support their employer or outvote their rivals?

To be fair, QS are aware that their surveys might be manipulated and have taken steps over the years to prevent this by such means as forbidding respondents from voting for their declared employer or repeat voting from the same computer. Even so, it seems that some universities, especially in Latin America, are getting scores in the reputation surveys that appear too high, especially when compared with their overall scores. In the employer survey the Pontifical Catholic University of Chile is 56th (overall 167) and the University of Buenos Aires 49th (overall 198). In the academic survey the University of Sao Paulo is 54th (overall 132) and the National Autonomous University of Mexico 55th (overall 175).

QS are apparently considering reforming their citations per faculty indicator and allowing unchanged responses to the surveys to be recycled for five instead of three years. This is welcome but a more rigorous overhaul of the reputation indicators is sorely needed.