This year and next the international university rankings appear set for greater volatility, with unusually large upward and downward movements, partly as a result of changes to the methodology for counting citations in the QS and THE rankings.
ARWU
The global ranking season kicked off last week with the publication of the latest edition of the Academic Ranking of World Universities from the ShanghaiRanking Consultancy (SRC), which I hope to discuss in detail in a little while. These rankings are rather dull and boring, which is exactly what they should be. Harvard is, as always, number one for all but one of the indicators. Oxford has slipped from joint ninth to tenth place. Warwick has leaped into the top 100 by virtue of a Fields medal. At the foot of the table there are new contenders from France, Korea and Iran.
Since they began in 2003 the Shanghai rankings have been characterised by a generally stable methodology. In 2012, however, they had to deal with the recruitment of a large and unprecedented number of adjunct faculty by King Abdulaziz University. Previously SRC had simply divided the credit for the Highly Cited Researchers indicator equally between all institutions listed as affiliations. In 2012 and 2013 they wrote to all highly cited researchers with joint affiliations and thus determined the division of credit between primary and secondary affiliations. Then, in 2014 and this year they combined the old Thomson Reuters list, first issued in 2001, and the new one, issued in 2014, and excluded all secondary affiliations in the new list.
The result was that in 2014 the rankings showed an unusual degree of volatility, although this year things are a lot more stable. My understanding is that Shanghai will move to counting only the new list next year, again without secondary affiliations, so there should be a lot of interesting changes then. It looks as though Stanford, Princeton, the University of Wisconsin -- Madison, and Kyoto University will suffer because of the change, while the University of California Santa Cruz, Rice University, the University of Exeter and the University of Wollongong will benefit.
While SRC has dealt efficiently with the issue of secondary affiliation in its Highly Cited indicator, the issue has now resurfaced in the unusually high scores achieved by King Abdulaziz University for publications, largely because of its adjunct faculty. Expect more discussion over the next year or so. It would seem sensible for SRC to consider a five- or ten-year period rather than a single year for their Publications indicator, and academic publishers, the media and rankers in general may need to give some thought to the proliferation of secondary affiliations.
QS
On July 27 Quacquarelli Symonds (QS) announced that for 18 months they had been thinking about normalising the counting of citations across five broad subject areas. They observed that a typical institution would receive about half of its citations from the life sciences and medicine, over a quarter from the natural sciences but just 1% from the arts and humanities.
In their forthcoming rankings QS will assign a 20% weighting for citations to each of the five subject areas, something that, according to Ben Sowter, Research Director at QS, they have already been doing for the academic opinion survey.
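The arithmetic behind equal subject-area weighting is simple enough to sketch. The code below is my own illustration, not QS's actual procedure; the area labels, citation figures and world averages are all invented for the example.

```python
# Illustrative sketch of equal 20% weighting across five broad subject areas.
# This is NOT QS's code; area names and figures are invented examples.

AREAS = ["arts_humanities", "engineering_technology",
         "life_sciences_medicine", "natural_sciences", "social_sciences"]

def normalised_citation_score(uni_rates, world_rates):
    """Average the university's citations-per-paper rate relative to the
    world rate in each area, giving every area the same 20% weight."""
    score = 0.0
    for area in AREAS:
        score += 0.2 * uni_rates[area] / world_rates[area]
    return score

# A humanities-strong institution is no longer swamped by medicine:
uni = {"arts_humanities": 4, "engineering_technology": 10,
       "life_sciences_medicine": 20, "natural_sciences": 15,
       "social_sciences": 6}
world = {"arts_humanities": 2, "engineering_technology": 8,
         "life_sciences_medicine": 40, "natural_sciences": 20,
         "social_sciences": 5}
print(round(normalised_citation_score(uni, world), 2))  # prints 1.14
```

Under a raw count this university's medicine citations would dominate the total; with each area capped at a 20% weight, its above-average performance in the arts and humanities counts for just as much.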
It would seem then that there are likely to be some big rises and big falls this September. I would guess that places strong in humanities, social sciences and engineering like LSE, New York University and Nanyang Technological University may go up and some of the large US state universities and Russian institutions may go down. That's a guess because it is difficult to tell what happens with the academic and employer surveys.
QS have also made an attempt to deal with the issue of hugely cited papers with hundreds, even thousands of "authors" -- contributors would be a better term -- mainly in physics, medicine and genetics. Their approach is to exclude all papers with more than 10 contributing institutions, that is 0.34% of all publications in the database.
This is rather disappointing. Papers with huge numbers of authors and citations obviously do have distorting effects but they have often dealt with fundamental and important issues. To exclude them altogether is to ignore a very significant body of research.
The obvious solution to the problem of multi-contributor papers is fractional counting, dividing the number of citations by the number of contributors or contributing institutions. QS claim that to do so would discourage collaboration, which does not sound very plausible.
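A minimal sketch of fractional counting by contributing institution (institution names and citation counts invented for illustration):

```python
# Fractional counting: each paper's citations are divided by its number of
# contributing institutions, so mega-collaboration papers no longer give
# every contributor full credit. Data here are invented examples.

def fractional_credit(papers, institution):
    """papers: list of (citations, list_of_institutions) tuples."""
    total = 0.0
    for citations, institutions in papers:
        if institution in institutions:
            total += citations / len(institutions)
    return total

papers = [
    (2000, ["inst_%d" % i for i in range(100)]),  # 100-institution mega-paper
    (30, ["inst_0", "inst_200"]),                 # ordinary collaboration
]
print(fractional_credit(papers, "inst_0"))  # 2000/100 + 30/2 = 35.0
```

Under full counting, inst_0 would be credited with all 2,030 citations; fractional counting reduces that to 35 while still acknowledging the mega-paper, rather than excluding it altogether.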
In addition, QS will likely extend the life of survey responses from three to five years. That could make the rankings more stable by smoothing out annual fluctuations in survey responses and reduce the volatility caused by the proposed changes in the counting of citations.
The shift to a moderate version of field normalisation is helpful as it will reduce the undue privilege given to medical research, without falling into the huge problems that result from using too many categories. It is unfortunate, however, that QS have not taken the plunge into fractional counting. One suspects that technical problems and financial considerations might be as significant as the altruistic desire not to discourage collaboration.
After a resorting in September the QS rankings are likely to become a bit more stable and credible, but their most serious problem, the structure, validity and excessive weighting of the academic survey, has still not been addressed.
THE
Meanwhile, Times Higher Education (THE) has also been grappling with the issue of authorship inflation. Phil Baty has announced that this year 649 papers with over 1,000 authors will be excluded from their calculation of citations because "we consider them to be so freakish that they have the potential to distort the global scientific landscape".
But it is not the papers that do the distorting; it is the methodology. THE and their former data partners Thomson Reuters have, like QS, avoided fractional counting (except for a small experimental African ranking), and so every one of those hundreds or thousands of authors gets full credit for the hundreds or thousands of citations. This has given places like Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Universite Cadi Ayyad in Morocco and Bogazici University in Turkey remarkably high scores for Citations: Research Impact, much higher than their scores for the bundled research indicators.
THE have decided simply to exclude the 649 papers, or 0.006% of the total, from their calculations for the world rankings, a much smaller proportion than the 0.34% excluded by QS. Again, this is a rather crude measure. Many of the "freaks" are major contributions to advanced research and deserve to be acknowledged by the rankings in some way.
THE did use fractional counting in their recent experimental ranking of African universities and Baty indicates that they are considering doing so in the future.
It would be a big step forward for THE if they introduce fractional counting of citations. But they should not stop there. There are other bugs in the citations indicator that ought to be fixed.
First, it does not at present measure what it is supposed to measure, namely a university's overall research impact. At best, it is a measure of the average quality of a university's research papers, no matter how few (above a certain threshold) they are.
Second, the "regional modification", which divides a university's citation impact score by the square root of the score of the country where the university is located, is another source of distortion. It gives a bonus to universities simply for being located in underperforming countries. THE or TR have justified the modification by suggesting that some universities deserve compensation because they lack funding or networking opportunities. Perhaps they do, but this can still lead to serious anomalies.
Thirdly, THE need to consider whether they should assign citations to so many fields, since this increases the distortions that can arise when there is a highly cited paper in a normally lowly cited field.
Fourthly, should they assign a thirty per cent weighting to an indicator that may be useful for distinguishing between the likes of MIT and Caltech but may be of little relevance for the universities that are now signing up for the world rankings?
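The regional modification in the second point above is easy to sketch. This is my reading of the published description, not THE's or TR's actual code, and the impact figures are invented:

```python
import math

# Sketch of the "regional modification": a university's citation impact
# score is divided by the square root of its country's score, so identical
# raw impact earns more in a low-scoring country. Numbers are invented.

def regionally_modified(university_impact, country_impact):
    return university_impact / math.sqrt(country_impact)

# Two universities with the same raw impact, 0.8 of the world average:
print(regionally_modified(0.8, 1.0))   # strong country: stays at 0.8
print(regionally_modified(0.8, 0.25))  # weak country: doubles to 1.6
```

The same raw performance is worth twice as much in the weaker country, which is exactly the anomaly described above.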
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Sunday, August 23, 2015
Sunday, August 16, 2015
Research Ranking of African Universities
I suppose it should no longer be surprising that university heads flock to Johannesburg for the unveiling of an "experimental" research ranking of 30 African universities that put the University of Port Harcourt in 6th place, did not include Cairo University, the University of Ibadan, Ahmadu Bello University or the University of Nigeria Nsukka, and placed Makerere above Stellenbosch and Universite Cadi Ayyad above the University of Pretoria.
It is still a bit odd that African universities seem to have ignored a reasonable and sensible research ranking from the Journals Consortium that I found while reading an article by Gerald Ouma in the Mail and Guardian Africa, which, by the way, had an advertisement about Universite Cadi Ayyad being number ten in the THE African ranking.
The Journals Consortium ranking is based on publications, citations and web visibility and altogether 1,447 institutions are ranked. The methodology, which is a bit thin, is here.
Here are the top ten.
1. University of Cape Town
2. Cairo University
3. University of Pretoria
4. University of Nairobi
5. University of South Africa
6. University of the Witwatersrand
7. Stellenbosch University
8. University of Ibadan
9. University of Kwazulu-Natal
10. Ain Shams University
The University of Port Harcourt is 36th and Universite Cadi Ayyad is 89th.
I am pleased to note that two of my former employers are in the rankings, University of Maiduguri in 66th place and Umar ibn Ibrahim El-Kanemi College of Education, Science and Technology (formerly Borno College of Basic Studies) in 988th.
Friday, August 14, 2015
This is also really frightening
From The Times, which is supposed to be a really posh paper -- I remember adverts "Top People Read the Times" -- read by people with degrees from Russell Group universities:
"Of the 3 million Muslims in Britain, about 2.3 million identify as Sunni, compared with 300,000 Shias, or 5 per cent of the total. Most British Shias have roots in Iran, Iraq, Azerbaijan or Bahrain. Sunnis make up the vast majority of Muslims worldwide."
Thursday, August 13, 2015
This is really frightening
The evidence that human intelligence is falling continues to accumulate. PISA scores in Sweden are down and not just among immigrants. The intelligence of US marines, as measured by the General Classification Test, has been in decline since the 1980s. Based on "a small, probably representative sample" the French national IQ has dropped since 1999.
And now we have this from an article about the possible revival of the Liberal Democrats by Jon Stone, who is a reporter, in the Independent, which is a newspaper.
"Lazarus is a character in the Christian holy book The Bible who comes back from the dead after an intervention by Jesus Christ, a religious figure."
I thought the Independent was one of the posh papers read by bright people who had degrees and knew how ignorant and illiterate UKIP supporters were.
Does Jon Stone really think he has to explain to his readers what the Bible is? Or is this some sort of PC policy?
The worst thing is that he apparently doesn't know that Lazarus was really a character in a Robert Heinlein novel.
Wednesday, August 12, 2015
The Plague of Authorship Inflation
An article in the Wall Street Journal by Robert Lee Hotz describes the apparently inexorable increase in the number of authors of scientific papers.
In 2014, according to the Web of Science, the number of papers with 50 or more authors reached over 1,400 and the number with 500 or more was over 200. The situation is getting so bad that one journal, Nature, was unable to list all the authors of a paper in the print edition.
Hotz has an amusing digression where he recounts how scientists have listed a hamster, a dog and a computer as co-authors.
One issue that he does not explore is the way in which multi-authorship has distorted global university rankings. Until this year, Times Higher Education and Thomson Reuters declined to use fractional counting of citations in their World University Rankings, so that every one of hundreds of contributors was credited with every one of thousands of citations. When this was combined with normalisation by 250 fields, so that a few citations could have a disproportionate effect, and a deceptive regional modification that rewarded universities for being in a country that produced few citations, the results could be ludicrous. Unproductive institutions, for example Alexandria University, those that are very small, for example Scuola Normale Superiore Pisa, or very specialised, for example Moscow State Engineering Physics Institute, have been presented by THE as world leaders for research impact.
Let us hope that this indicator is reformed in the forthcoming world rankings.
Sunday, August 09, 2015
Another Ranking Indicator for Africa
The prestigious and exclusive THE African summit is over. Whether it will lead to a serious regional ranking remains to be seen. The indicators used by THE in their world rankings and various regional spin-offs (reputation for research, income in three different indicators, citations, number of doctoral students) seem generally inappropriate for all but about two dozen institutions.
But there is still a need to compare and evaluate the effectiveness of African universities in providing instruction in academic, technical and professional subjects and perhaps in their participation in innovative and economically beneficial projects.
Probably the way ahead for African ranking is the use of social media, bypassing the very problematical collection of institutional data. More of that later.
Anyway, here is a ranking of African universities according to the number of results from a search of the WIPO Patentscope site. Searching was done on the 5th and 6th of August. Universities included the top 50 African universities in Webometrics and any university in the recent THE pilot ranking. All fields were searched.
There are no real surprises. South Africa is dominant, followed by Egypt. The flagships of Uganda, Kenya, Ghana and Nigeria are represented. Most universities in Africa do no innovative research reflected in patents.
Rank | University | Country | References in patents (any field) |
---|---|---|---|
1 | University of Cape Town | South Africa | 377 |
2 | University of Pretoria | South Africa | 242 |
3 | University of the Witwatersrand | South Africa | 217 |
4 | Stellenbosch University | South Africa | 165 |
5 | North West University | South Africa | 125 |
6 | Cairo University | Egypt | 100 |
7 | University of the Free State | South Africa | 72 |
8 | University of Johannesburg | South Africa | 46 |
9 | University of Kwazulu-Natal | South Africa | 41 |
10 | Nelson Mandela Metropolitan University | South Africa | 34 |
11 | Assiut University | Egypt | 31 |
12 | Rhodes University | South Africa | 30 |
13 | University of Nairobi | Kenya | 21 |
14 | Makerere University | Uganda | 20 |
15 | University of the Western Cape | South Africa | 18 |
16 | American University in Cairo | Egypt | 17 |
17 | University of Ghana | Ghana | 13 |
18 | Université Mohammed V Souissi | Morocco | 12 |
19 | Cape Peninsula University of Technology | South Africa | 11 |
20 | Mansoura University | Egypt | 10 |
21 | University of Namibia | Namibia | 9 |
22 | Alexandria University | Egypt | 8 |
23 | University of Ibadan | Nigeria | 7 |
24= | Kenyatta University | Kenya | 6 |
24= | University of Zimbabwe | Zimbabwe | 6 |
24= | Durban University of Technology | South Africa | 6 |
27= | University of South Africa | South Africa | 5 |
27= | Zagazig University | Egypt | 5 |
27= | Suez Canal University | Egypt | 5 |
30= | University of Dar Es Salaam | Tanzania | 4 |
30= | Addis Ababa University | Ethiopia | 4 |
32= | University of Ilorin | Nigeria | 3 |
32= | University of Khartoum | Sudan | 3 |
32= | University of Malawi | Malawi | 3 |
35= | Helwan University | Egypt | 2 |
35= | Université Hassan II Ain Chock | Morocco | 2 |
35= | Université Cadi Ayyad Marrake | Morocco | 2 |
35= | Kafrelsheikh University | Egypt | 2 |
35= | University of Zambia | Zambia | 2 |
35= | Ahmadu Bello University | Nigeria | 2 |
41= | University of Lagos | Nigeria | 1 |
41= | Université Cheikh Anta Diop | Senegal | 1 |
41= | University of Mauritius | Mauritius | 1 |
41= | Université de Constantine 1 | Algeria | 1 |
41= | Université de Yaounde 1 | Cameroon | 1 |
46= | Obafemi Awolowo University | Nigeria | 0 |
46= | Kwame Nkrumah University of Science and Technology | Ghana | 0 |
46= | University of Port Harcourt | Nigeria | 0 |
46= | University of Botswana | Botswana | 0 |
46= | Tanta University | Egypt | 0 |
46= | Covenant University | Nigeria | 0 |
46= | Bejaia university | Algeria | 0 |
46= | Minia University | Egypt | 0 |
46= | University of Tunis | Tunisia | 0 |
46= | Benha University | Egypt | 0 |
46= | Universidade Católica de Angola | Angola | 0 |
46= | Université de Lomé | Togo | 0 |
46= | South Valley University | Egypt | 0 |
46= | Université Abou Bekr Belkaid | Algeria | 0 |
46= | Beni-Suef university | Egypt | 0 |
46= | Université Omar Bongo | Gabon | 0 |
46= | University of The Gambia | Gambia | 0 |
46= | Université de Toliara | Madagascar | 0 |
46= | Université Kasdi Merbah Ouargia | Algeria | 0 |
46= | Universite de la Reunion | Reunion | 0 |
46= | Universidade Eduardo Mondlane | Mozambique | 0 |
46= | Université de Ouagadougou | Burkina Faso | 0 |
46= | University of Rwanda | Rwanda | 0 |
46= | Universite de Bamako | Mali | 0 |
46= | University of Swaziland | Swaziland | 0 |
46= | Université Félix Houphouët-Boigny | Ivory Coast | 0 |
46= | Université de Kinshasa | Democratic Republic of the Congo | 0 |
46= | National University of Lesotho | Lesotho | 0 |
46= | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0 |
46= | National Engineering School of Sfax | Tunisia | 0 |
46= | Université Marien Ngouabi | Republic of the Congo | 0 |
46= | University of Liberia | Liberia | 0 |
46= | Université Djillali Liabes | Algeria | 0 |
46= | Université Abdou Moumouni de Niamey | Niger | 0 |
46= | Misurata University | Libya | 0 |
46= | Université de Dschang | Cameroon | 0 |
46= | Université de Bangui | Central African Republic | 0 |
46= | Université de Nouakchott | Mauritania | 0 |
46= | Eritrea Institute of Technology | Eritrea | 0 |
46= | Université de Djibouti | Djibouti | 0 |
46= | University of Seychelles | Seychelles | 0 |
46= | Mogadishu University | Somalia | 0 |
46= | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0 |
46= | Universite Gamal Abdel Nasser de Conakry | Guinea | 0 |
46= | University of Makeni | Sierra Leone | 0 |
46= | John Garang Memorial University | South Sudan | 0 |
46= | Hope African University | Burundi | 0 |
46= | Universite de Moundou | Chad | 0 |
The Onion Analyses the US News Rankings
Just an extract. The whole thing is here.
- Step 1: Schools are weighed on a scale
- Step 2: Researchers calculate each campus’ student-to-student ratio
- Step 3: Any college whose colors are maroon and gold is immediately eliminated
Friday, August 07, 2015
Error announcement from CWTS Leiden Ranking
See here for an error announcement from CWTS Leiden Ranking.
The prompt disclosure of the error adds to the credibility of the rankings.
Wednesday, August 05, 2015
Japan Targets the THE Rankings
The Wall Street Journal has an article about the proposed transformation of Japanese higher education. The national government is apparently using financial pressure to persuade universities to either become world class or local industry-orientated institutions.
World class means being in the top 100 of the THE world university rankings. Japan wants to have ten there. Now it only has two.
It is not a good idea to focus on just one ranking. If the Japanese government insists on aiming at just one then THE might not be the best bet. The Shanghai rankings are more stable, reliable and transparent. In addition, it seems that the THE rankings are biased against Japan.
Tuesday, August 04, 2015
Degree Class is no Longer Important
The relentless grade inflation in British secondary and tertiary education has been well documented. A first or upper second class degree or a grade A at A level no longer means very much. It has been far too easy for universities to cover up their deficiencies or attract applications by handing out firsts or upper seconds like smarties.
Now the consequences are beginning to become apparent. Ernst and Young (EY), the accounting firm, will no longer require applicants to have an upper second or three grade Bs. Instead they will use "numerical tests" and "strength assessments" to assess applicants.
I suspect that in a little while EY will come under fire for not recruiting enough of those groups that do not test well, especially in quantitative skills. They and others will probably join in the hunt for the Holy Grail of modern social science, the factor that is non-cognitive but still significantly predictive of career success.
Meanwhile, universities will try to find new functions to replace their historical role as the guarantor of a certain level of cognitive ability. Expect to see more conversations about assessing civic engagement or reaching out to communities.
Monday, August 03, 2015
The CWUR Rankings 2015
The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced the latest edition of its global ranking of 1,000 universities. The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.
The rankings include five indicators that measure various aspects of publication and research: publications in "reputable journals", research papers in "highly influential" journals, citations, h-index and patents.
These indicators are given a combined weighting of 25%.
Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). This is obviously a crude measure which fails to distinguish among the great mass of universities that have never won an award.
Similarly, another quarter is assigned to Quality of Faculty, measured by the number of faculty receiving such awards, and the final quarter to Alumni Employment, measured by the number of alumni holding CEO positions at top companies. Again, these indicators are of little or no relevance to all but a few hundred institutions.

The top ten are:
1. Harvard
2. Stanford
3. MIT
4. Cambridge
5. Oxford
6. Columbia
7. Berkeley
8. Chicago
9. Princeton
10. Cornell.
The only change from last year is that Cornell has replaced Yale in tenth place.
Countries with Universities in the Top Hundred in 2015 and 2014
Country | Universities in top 100 2015 | 2014 |
---|---|---|
US | 55 | 53 |
UK | 7 | 7 |
Japan | 7 | 8 |
Switzerland | 4 | 4 |
France | 4 | 4 |
Canada | 3 | 3 |
Israel | 3 | 3 |
South Korea | 2 | 1 |
Germany | 2 | 4 |
Australia | 2 | 2 |
China | 2 | 2 |
Netherlands | 2 | 1 |
Russia | 1 | 1 |
Taiwan | 1 | 1 |
Belgium | 1 | 1 |
Norway | 1 | 0 |
Sweden | 1 | 2 |
Singapore | 1 | 1 |
Denmark | 1 | 1 |
Italy | 0 | 1 |
Top Ranked in Region or Country
USA: Harvard
Canada: Toronto
Asia: Tokyo
South Asia: IIT Delhi
Southeast Asia : National University of Singapore
Europe: Cambridge
Central and Eastern Europe: Lomonosov Moscow State University
Arab World: King Saud University
Middle East: Hebrew University of Jerusalem
Latin America: Sao Paulo
Africa: University of the Witwatersrand
Caribbean: University of Puerto Rico at Mayagüez
Noise Index
In the top 20, the CWUR rankings are more stable than THE and QS but less stable than the Shanghai rankings.
Average position change of universities in the top 20 in 2014: 0.5
Comparison
CWUR 2013-14: 0.9
Shanghai Rankings (ARWU)
2011-12: 0.15
2012-13: 0.25
THE WUR 2012-13: 1.2
QS WUR 2012-13: 1.7
With regard to the top 100, the CWUR rankings are more stable this year, with volatility similar to that of the QS and THE rankings, although still considerably greater than ARWU's.
Average position change of universities in the top 100 in 2014: 4.15
Comparison
CWUR 2013-14: 10.59
Shanghai Rankings (ARWU)
2011-12: 2.01
2012-13: 1.66
THE WUR 2012-13: 5.36
QS WUR 2012-13: 3.97
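For readers who want to reproduce figures like these, the noise index here is simply the mean absolute change in rank position between two editions. The sketch below is my reconstruction; how universities that drop out of the table are handled is an assumption, and the sample data are invented.

```python
# Reconstruction of the noise index: average absolute change in rank for
# universities in the top N in year 1 that also appear in year 2.
# Treatment of drop-outs is my assumption; sample data are invented.

def average_position_change(ranks_y1, ranks_y2, top_n=20):
    """ranks_*: dicts mapping university name -> rank position."""
    changes = [abs(ranks_y2[u] - r)
               for u, r in ranks_y1.items()
               if r <= top_n and u in ranks_y2]
    return sum(changes) / len(changes)

y1 = {"A": 1, "B": 2, "C": 3, "D": 4}
y2 = {"A": 1, "B": 3, "C": 2, "D": 4}   # B and C swap places
print(average_position_change(y1, y2, top_n=4))  # (0+1+1+0)/4 = 0.5
```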
Saturday, August 01, 2015
The Other THE African University Rankings
THE have just presented Africa and the world with a list of 30 African universities ranked according to "research impact", that is the number of citations per paper normalised by field (300 of them?) and year. Citations are not just counted but compared with the world average for specific years and fields.
The result is that a university that manages to join a large international project, typically in medicine, genetics or particle physics, with a disproportionate number of citations, especially in the first couple of years of publication, can get an extremely high score. If the university had few publications to begin with, the score for this indicator will be even higher.
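A toy version of this field- and year-normalisation (my own sketch with invented fields, years and citation counts, not THE's or Scopus's actual procedure) shows how one mega-cited paper can dominate a small university's average:

```python
# Sketch: each paper's citations are divided by the world average for its
# field and publication year, and the university's score is the mean of
# those ratios. Fields, years and counts are invented for illustration.

def normalised_impact(papers, world_avg):
    """papers: list of (field, year, citations);
    world_avg: dict keyed by (field, year) giving average citations."""
    ratios = [citations / world_avg[(field, year)]
              for field, year, citations in papers]
    return sum(ratios) / len(ratios)

world = {("physics", 2012): 10.0, ("history", 2012): 2.0}
# A small university with one mega-collaboration paper and one ordinary paper:
small = [("physics", 2012, 500), ("history", 2012, 2)]
print(normalised_impact(small, world))  # (50 + 1) / 2 = 25.5
```

With only two papers in the denominator, a single highly cited collaboration paper lifts the average to 25.5 times the world norm, which is the mechanism behind the odd results described above.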
For these rankings THE have introduced fractionalised counting, so that a university that is one of 100 contributors to a project with 2,000 citations would get the equivalent of 20 citations. Under the procedure that THE and their former data collectors Thomson Reuters had been using for the world university rankings, it would have been credited with all 2,000 citations, as would every other contributor.
THE are to be congratulated for finally using fractionalised counting, which has reduced the likelihood of the indicator producing very odd results. Even so, the snapshot ranking is inappropriate for African universities, as it still privileges those that happen to contribute to a few international projects.
The results might seem acceptable to THE and its international audience but I suspect that Egyptian academics will be amused by a ranking that includes six Egyptian universities but not Cairo University. I wonder how many Nigerians will accept a ranking that includes Port Harcourt but not Ibadan or Ahmadu Bello.
Along with standardised scores for citations THE has also included the number of publications in the Scopus database between 2009 and 2013. This, a measure of research output of fairly high quality, is probably more relevant to Africa than the citations indicator. Unfortunately, it shows the very limited amount of research done between the Sahara and the Kalahari, and so it would have been inexpedient to present it as a snapshot of what a future ranking might look like.
The methods, approaches and assumptions of THE's world rankings, with their emphasis on inputs (especially income), research quality (inappropriately called research impact or research influence), reputation, and doctoral education, are of limited value to all but a few African universities and stakeholders. Whether anything of value comes from the conversation in Johannesburg remains to be seen, but it is unlikely that a modified version of the world rankings will be of much value to anyone.
Anyway, below are the 30 African universities reordered according to number of publications.
Rank | University | Country | number of publications 2009-2013 |
---|---|---|---|
1 | University of Cape Town | South Africa | 5540.21 |
2 | University of Pretoria | South Africa | 4544.33 |
3 | University of the Witwatersrand | South Africa | 4387.17 |
4 | Stellenbosch University | South Africa | 4357.33 |
5 | University of Kwazulu-Natal | South Africa | 4235.09 |
6 | Alexandria University | Egypt | 2550.15 |
7 | Universite de Sfax | Tunisia | 2355.30 |
8 | University of Johannesburg | South Africa | 2192.74 |
9 | North West University | South Africa | 1707.94 |
10 | Assiut University | Egypt | 1588.64 |
11 | University of the Free State | South Africa | 1512.56 |
12 | Université Mohammed V – Agdal | Morocco | 1503.69 |
13 | Rhodes University | South Africa | 1296.96 |
14 | University of the Western Cape | South Africa | 1154.77 |
15 | Makerere University | Uganda | 1112.69 |
16 | Suez Canal University | Egypt | 998.98 |
17 | University of South Africa | South Africa | 981.67 |
18 | Universite Hassan II Casablanca | Morocco | 960.25 |
19 | Nelson Mandela Metropolitan University | South Africa | 885.77 |
20 | Universite Cadi Ayyad | Morocco | 910.82 |
21 | Addis Ababa University | Ethiopia | 893.90 |
22 | Université de Tunis | Tunisia | 879.63 |
23 | Universite de Yaounde I | Cameroon | 876.33 |
24 | Ecole Nationale d’Ingénieurs de Sfax | Tunisia | 822.31 |
25 | University of Ghana | Ghana | 804.53 |
26 | American University in Cairo | Egypt | 700.89 |
27 | Minia University | Egypt | 694.79 |
28 | University of Nairobi | Kenya | 671.72 |
29 | South Valley University | Egypt | 636.85 |
30 | University of Port Harcourt | Nigeria | 573.55 |
Wednesday, July 29, 2015
Google Scholar Ranking of African Universities
As competition in the ranking world intensifies, Times Higher Education (THE) and Quacquarelli Symonds are diligently promoting their various regional ranking, data processing and event management projects. The latest is THE's African summit at the University of Johannesburg.
Three weeks ago THE issued what they described as an "experimental and preliminary" ranking, which consisted of 15 universities ordered according to the number of citations per paper, normalised for field and year. An interesting innovation was that citations were fractionalised so that participants in large collaborative projects would be credited in proportion to their fraction of the total contributors.
This is just one indicator and it is not really a measure of research influence, but rather of research quality and it is still skewed by participation in multi-contributor papers in medicine and particle physics. It is unlikely that the University of Port Harcourt or Universite Cadi Ayyad would be in the top ten of any other indicator.
THE have indicated that they will add another 15 names to the list at the Johannesburg summit.
The table below was compiled to check the claims of THE or other rankers that might attempt to evaluate African universities. It simply counts the number of results (2012-2014; excluding citations and patents) from a query to Google Scholar. Data were compiled on the 25th and 26th of July. The criteria for inclusion were being in the top 50 of the Webometrics rankings or among the 15 universities in the THE list. The top university in any country not included was added from either the Webometrics or 4icu rankings.
This database includes papers, reports, theses and dissertations, conference proceedings and so on. It is certainly not a measure of research quality but rather of the volume of any activities connected with research. In the case of the two Kenyan universities it probably reflects the size and inclusiveness of the university repositories.
One thing about the Google Scholar list is that it confirms suspicions that the quality of Egyptian universities has been underestimated by the big-name rankers. For further evidence one might look at data from social media such as LinkedIn, or just contrast the aspirations of Egyptian students in the revolutions of 2011 and 2013 with those of students at the University of Cape Town and Durban University of Technology.
Rank | University | Country | Google Scholar Results |
---|---|---|---|
1 | University of Cape Town | South Africa | 17,000 |
2 | Cairo University | Egypt | 16,800 |
3 | University of Pretoria | South Africa | 16,500 |
4 | University of Nairobi | Kenya | 16,400 |
5 | University of the Witwatersrand | South Africa | 15,800 |
6 | University of Kwazulu-Natal | South Africa | 15,500 |
7 | Stellenbosch University | South Africa | 14,900 |
8 | University of Ibadan | Nigeria | 14,800 |
9 | University of South Africa | South Africa | 13,500 |
10 | Kenyatta University | Kenya | 12,000 |
11 | University of Johannesburg | South Africa | 11,200 |
12 | Makerere University | Uganda | 10,400 |
13 | North West University | South Africa | 10,100 |
14 | University of Ghana | Ghana | 8,330 |
15 | Alexandria University | Egypt | 7,610 |
16 | University of Lagos | Nigeria | 7,220 |
17 | Rhodes University | South Africa | 7,210 |
18 | University of the Western Cape | South Africa | 6,870 |
19 | Obafemi Awolowo University | Nigeria | 6,800 |
20 | Mansoura University | Egypt | 6,480 |
21 | University of the Free State | South Africa | 6,400 |
22 | Addis Ababa University | Ethiopia | 6,210 |
23 | Zagazig University | Egypt | 6,160 |
24 | American University in Cairo | Egypt | 5,770 |
25 | University of Ilorin | Nigeria | 5,620 |
26 | Assiut University | Egypt | 5,580 |
27 | Kwame Nkrumah University of Science and Technology | Ghana | 5,080 |
28= | University of Zimbabwe | Zimbabwe | 4,830 |
28= | University of Port Harcourt | Nigeria | 4,830 |
30 | University of Botswana | Botswana | 4,260 |
31 | University of Zambia | Zambia | 4,240 |
32 | University of Dar Es Salaam | Tanzania | 4,120 |
33 | University of Khartoum | Sudan | 4,110 |
34 | Suez Canal University | Egypt | 3,670 |
35 | Tanta University | Egypt | 3,600 |
36 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 3,520 |
37 | Nelson Mandela Metropolitan University | South Africa | 3,490 |
38 | Covenant University Ota | Nigeria | 2,950 |
39 | Helwan University | Egypt | 2,940 |
40 | Benha University | Egypt | 2,570 |
41 | Minia University | Egypt | 2,390 |
42 | University of Malawi | Malawi | 2,340 |
43 | Université Abou Bekr Belkaid | Algeria | 2,290 |
44 | University of Tunis | Tunisia | 2,270 |
45= | Université Kasdi Merbah Ouargla | Algeria | 2,240 |
45= | Cape Peninsula University of Technology | South Africa | 2,240 |
47 | Université Cheikh Anta Diop de Dakar | Senegal | 1,950 |
48 | University of Namibia | Namibia | 1,760 |
49 | Universite de la Reunion | Reunion | 1,690 |
50 | Durban University of Technology | South Africa | 1,560 |
51 | University of Mauritius | Mauritius | 1,490 |
52 | Université d'Abomey-Calavi | Benin | 1,460 |
53 | South Valley University | Egypt | 1,440 |
54 | Universidade Eduardo Mondlane | Mozambique | 1,420 |
55 | Beni-Suef University | Egypt | 1,400 |
56 | Université Cadi Ayyad Marrakech | Morocco | 1,370 |
57 | Université de Ouagadougou | Burkina Faso | 1,300 |
58 | University of Rwanda | Rwanda | 1,270 |
59 | Université des Sciences et de la Technologie Houari Boumediene | Algeria | 976 |
60 | Université de Lomé | Togo | 784 |
61 | Université de Bamako | Mali | 660 |
62 | Kafrelsheikh University | Egypt | 618 |
63 | University of Swaziland | Swaziland | 615 |
64 | Université Félix Houphouët-Boigny | Ivory Coast | 590 |
65 | Université de Kinshasa | Democratic Republic of the Congo | 558 |
66 | National University of Lesotho | Lesotho | 555 |
67 | Université Constantine 1 | Algeria | 468 |
68 | Bejaia University | Algeria | 413 |
69 | Universidade Jean Piaget de Cabo Verde | Cape Verde | 407 |
70 | Université Mohammed V Souissi | Morocco | 361 |
71 | National Engineering School of Sfax | Tunisia | 271 |
72 | Université Marien Ngouabi | Republic of the Congo | 256 |
73 | University of Liberia | Liberia | 255 |
74 | Université Djillali Liabes | Algeria | 243 |
75 | Université Abdou Moumouni de Niamey | Niger | 206 |
76 | Misurata University | Libya | 155 |
77 | Université Omar Bongo | Gabon | 138 |
78 | University of The Gambia | Gambia | 130 |
79 | Universidade Católica de Angola | Angola | 115 |
80= | Université de Dschang | Cameroon | 113 |
80= | Université de Bangui | Central African Republic | 113 |
82 | Université de Nouakchott | Mauritania | 108 |
83 | Eritrea Institute of Technology | Eritrea | 76 |
84 | Université de Djibouti | Djibouti | 66 |
85 | Université de Toliara | Madagascar | 59 |
86 | Université Hassan II Ain Chock | Morocco | 55 |
87 | University of Seychelles | Seychelles | 52 |
88 | Mogadishu University | Somalia | 51 |
89 | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 40 |
90 | Universite Gamal Abdel Nasser de Conakry | Guinea | 21 |
91 | University of Makeni | Sierra Leone | 18 |
92 | John Garang Memorial University | South Sudan | 12 |
93 | Hope Africa University | Burundi | 3 |
94 | Universite de Moundou | Chad | 2 |
Tuesday, July 28, 2015
Would anyone notice if a small, old but awfully clever dog filled in a university ranking survey, and would it make a difference?
The Australian newspaper The Age has a piece by Erica Cervini on how she allowed her dog to complete the QS academic reputation survey on the quality of veterinary schools.
She doesn't elaborate on how the dog chose the schools. Was it by barking or tail wagging when shown pictures of the buildings?
Seriously though, she does have a point. Can QS stop people signing up just to support their employer or outvote their rivals?
To be fair, QS are aware that their surveys might be manipulated and have taken steps over the years to prevent this by such means as forbidding respondents from voting for their declared employer or repeat voting from the same computer. Even so, it seems that some universities, especially in Latin America, are getting scores in the reputation surveys that appear too high, especially when compared with their overall scores. In the employer survey the Pontifical Catholic University of Chile is 56th (overall 167) and the University of Buenos Aires 49th (overall 198). In the academic survey the University of Sao Paulo is 54th (overall 132) and the National Autonomous University of Mexico 55th (overall 175).
QS are apparently considering reforming their citations per faculty indicator and allowing unchanged responses to the surveys to be recycled for five instead of three years. This is welcome but a more rigorous overhaul of the reputation indicators is sorely needed.
Thursday, July 23, 2015
Perfect Storm Heading for Tokyo Metropolitan University
Seen on the Times Higher Education website today:
Tokyo Metropolitan University
World's Best University
Scored a Perfect 100.00 for Two Years in Citations Sector
From TMU to the World
Tokyo Metropolitan University got its perfect score largely because it was one of hundreds of institutions contributing to a few publications from the Large Hadron Collider project. In their recent experimental African rankings THE started using fractionalised counting of citations. If THE use this method in the coming world rankings then TMU will surely suffer a dramatic fall in the citations indicator.
I would not like to be the president of TMU on September 30th.
Wednesday, July 22, 2015
Recommended Reading
Anybody interested in educational policy, especially the never-ending campaign to close gaps of one sort or another, or in the oddities of university rankings, should take a look at chapter four of Jordan Ellenberg's How Not to Be Wrong: The Power of Mathematical Thinking, which is about the obvious -- or what ought to be obvious -- observation that smaller populations are more variable.
He notes that South Dakota is top of the league for brain cancer while North Dakota is near the bottom. What makes the difference? It is just that the bigger the population the more likely it is that outliers will be diluted by a great mass of mediocrity. So, extreme scores tend to crop up in small places or small samples.
Similarly, when he tossed coins ten at a time, he came up with head counts ranging from 3 to 9 out of ten.
When he tossed them 100 at a time he got counts ranging from 45 to 60.
When he (actually his computer program) tossed them 1,000 times, the counts ranged from 462 to 537.
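The shrinking spread can be reproduced with a short simulation (seeded here for repeatability):

```python
# A short simulation of the coin-toss observation above: the spread of
# head proportions shrinks as the number of tosses per trial grows, so
# extreme results turn up far more often in small samples.
import random

random.seed(0)  # seeded so the run is repeatable

def head_proportion_range(n_tosses, n_trials=200):
    """Min and max proportion of heads over repeated batches of tosses."""
    props = [sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses
             for _ in range(n_trials)]
    return min(props), max(props)

for n in (10, 100, 1000):
    lo, hi = head_proportion_range(n)
    print(n, round(lo, 3), round(hi, 3))
```

The range for batches of 10 is much wider than for batches of 1,000, which is Ellenberg's point in miniature.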
It is worth remembering this when a study with a double-digit sample is published showing the latest way to close one of the achievement gaps, or a very small school in a rural state somewhere starts boosting the test scores of underperforming students, or a few test takers reveal that the national IQ is imploding. Or when the studies fail to be replicated, if indeed anyone tries.
Or university rankings that show very small or very unproductive institutions having an enormous research impact measured by citations.
Saturday, July 18, 2015
In the QS BRICS rankings nearly everybody gets a prize
There is a growing trend towards specialised and regional university rankings. The magic of this is that they can provide something for almost everybody. QS recently published its latest BRICS rankings which combined data from five very different university systems. The result was a triumphant (almost) success for everybody (almost).
Here are some highlights that QS could use in selling the BRICS rankings or an expanded version.
Russian universities are ahead of everybody else for teaching quality.
The top 21 universities in the BRICS for Faculty Student Ratio (perhaps not a perfect proxy for teaching excellence) are Russian, headed by Bauman Moscow State Technical University. Imagine what Russian universities could do if QS recognised the importance of teaching and increased the weighting for this indicator.
India performs excellently for Faculty with a PhD.
Out of the top 15 for this category, ten are Indian and all of these get the maximum score of 100. Of the other five, four are Brazilian and one Chinese. If only QS realised the importance of a highly qualified faculty, India would do much better in the overall rankings.
South Africa takes five out of the first six places for international faculty.
China has four out of five top places for academic reputation and employer reputation.
Meanwhile a Brazilian university is first for international faculty and another is third for academic reputation.
It seems that with rankings like these a lot depends on the weighting assigned to the various indicators.
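The point can be sketched with two invented universities and two weighting schemes; the indicator names, scores and weights below are all hypothetical, not QS's actual values:

```python
# Sketch of how indicator weightings drive overall rank: the same two
# hypothetical universities swap places when weight shifts from
# reputation toward faculty-student ratio. All figures are invented.

def overall(scores, weights):
    return sum(scores[k] * weights[k] for k in weights)

uni_a = {"reputation": 90, "faculty_student": 40}
uni_b = {"reputation": 50, "faculty_student": 95}

reputation_heavy = {"reputation": 0.7, "faculty_student": 0.3}
teaching_heavy   = {"reputation": 0.3, "faculty_student": 0.7}

print(overall(uni_a, reputation_heavy), overall(uni_b, reputation_heavy))  # 75.0 63.5
print(overall(uni_a, teaching_heavy), overall(uni_b, teaching_heavy))      # 55.0 81.5
```

Under the reputation-heavy scheme A leads; under the teaching-heavy scheme B does, with no change in the underlying data.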
Yes, going to the library might be good for you
A study by Felly Chiteng Kot and Jennifer L. Jones has found that
"using a given library resource was associated with a small, but also meaningful, gain in first-term grade point average, net of other factors."
But, correlation does not necessarily mean causation. Could it be that bright people like to go to libraries?
Still, most students are likely to behave better in the library than other places, so let's not quibble too much.
"using a given library resource was associated with a small, but also meaningful, gain in first-term grade point average, net of other factors."
But, correlation does not necessarily mean causation. Could it be that bright people like to go to libraries?
Still, most students are likely to behave better in the library than other places, so let's not quibble too much.
Tuesday, July 14, 2015
Implications of the THE African Pilot Ranking
The most interesting thing about THE's experimental African ranking is the use of fractionalised counting of citations. This means that the total number of citations is divided by the number of institutions contributing to a publication. Previously, the method used in THE rankings was to assign the total citations of a paper to all of the institutions that contributed, just as though each one had been the only contributor. This has produced some very questionable results, with universities that were excellent but very specialised, or just generally unproductive, achieving remarkably high scores for citations. Panjab University, Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Federico Santa Maria Technical University and Moscow State Engineering Physics Institute have all had moments of glory in the THE rankings because of citation scores that were dramatically higher than their scores for research or any other indicator or group of indicators.
The new method, if applied generally, is likely to see a significant reduction in the scores given to such universities. We can estimate what might happen by looking at the four universities that are included in both the African pilot ranking and last year's world rankings: Cape Town, Witwatersrand, Stellenbosch and Université Cadi Ayyad of Marrakesh, Morocco.
In the world rankings these universities received scores of 86.6, 67.3, 45.6 and 83 respectively. The score of 83 for Universite Cadi Ayyad resulted very largely from its contributions to several publications from the Large Hadron Collider project, one of which has been cited over 2,000 times, a low overall output of papers and the "regional modification" that gave a big boost to low scoring countries. The scores for the three South African universities reflected a larger total output and citations over a broad range of disciplines.
In the African pilot ranking the scores for citations were Cape Town 90.90, Witwatersrand 99.76, Stellenbosch 95.48 and Cadi Ayyad 78.61. The high scores for the South African institutions reflect a much lower mean score than in the world rankings.
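The effect of a weaker cohort on standardised scores can be sketched with invented data. The scoring below is a simple z-score against the cohort mean, which is an assumption about how the rescaling works rather than THE's published method:

```python
# Sketch of why the same raw citation performance earns a higher
# standardised score in a weaker cohort: here the score is a z-score
# against the cohort's mean and standard deviation. Data invented.
import statistics

def standardised(value, cohort):
    mean = statistics.mean(cohort)
    sd = statistics.pstdev(cohort)
    return (value - mean) / sd

world_cohort = [1.0, 1.5, 2.0, 2.5, 3.0]   # stronger field, higher mean
africa_cohort = [0.2, 0.4, 0.6, 0.8, 2.0]  # weaker field, lower mean

print(round(standardised(2.0, world_cohort), 2))   # 0.0  (merely average)
print(round(standardised(2.0, africa_cohort), 2))  # about 1.9 (far above the mean)
```

The same raw value of 2.0 is unremarkable against the strong cohort but looks exceptional against the weak one.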
The fall in Cadi Ayyad's citation score from 3.6 points below Cape Town to 21.3 below and its falling behind Stellenbosch and Witwatersrand presumably reflect the impact of fractionalised counting.
This suggests that if fractionalised counting is used in the coming World University Rankings many small or specialised institutions will suffer and there will be a lot of reshuffling.
Thursday, July 09, 2015
The Top University in Africa is ...
... the University of Cape Town. What a surprise!
Times Higher Education (THE) has produced another "snapshot" ranking. This one is a list of 15 African universities ranked according to "research influence", that is, the number of citations per paper normalised by field and year. It seems that a larger list will be published at a THE summit at the University of Johannesburg scheduled for the end of this month. Then, apparently, there will be discussions about full rankings with a broad range of indicators.
This is a smart move. Apart from diluting the impact of the QS BRICS rankings, this table puts the summit host in the top ten and gets attention from around the continent with three places in the north, two in the west and two in the east in the top fifteen.
Here is the top 15:
1. University of Cape Town, South Africa
2. University of the Witwatersrand, South Africa
3. Makerere University, Uganda
4. Stellenbosch University, South Africa
5. University of Kwazulu-Natal, South Africa
6. University of Port Harcourt, Nigeria
7. University of the Western Cape, South Africa
8. University of Nairobi, Kenya
9. University of Johannesburg, South Africa
10. Universite Cadi Ayyad, Morocco
11. University of Pretoria, South Africa
12. University of Ghana
13. University of South Africa
14. Suez Canal University, Egypt
15. Universite Hassan II, Morocco.
This is, of course, just one indicator but even so there will be a few academic eyebrows rising around the continent. Makerere has a good national and regional reputation but does it have more research influence than all but two South African universities?
How come Suez Canal University is there but not Cairo University or the American University in Cairo? And I am sure that in Nigeria there will be a lot of smirking around Ahmadu Bello and Ibadan Universities about Port Harcourt in sixth place.
One very good thing about this "experimental and preliminary ranking" is that THE and data provider Scopus are now using fractionalised counting of citations, so that if 100 universities contribute to a publication they each get credit for one hundredth of the citations.
That has not stopped Makerere and Port Harcourt from getting a boost, perhaps too much of a boost, for taking part in a huge multinational medical study but it has reduced the distortions that this indicator can cause.
So, for once, well done THE!... Now, what about taking a look at secondary affiliations?
Saturday, July 04, 2015
Is this really happening?
If this continues France will be the least intelligent country in the world in a century.
Drawing straight lines on graphs and getting excited about tiny samples can be dangerous. Even so, this is a little frightening.
James Thompson's blog notes a study by Edward Dutton and Richard Lynn that suggests that the French national average IQ declined by nearly 4 points in a decade. The sample size was only 79 so we should not start panicking too much until there are a few more studies. It will be interesting to see if this one is replicated.
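A back-of-envelope calculation shows why a sample of 79 warrants caution. The standard deviation of 15 below is the conventional IQ figure, not necessarily the study's own:

```python
# Rough check on the sample-size worry above: with a conventional IQ
# standard deviation of 15 and n = 79, the standard error of a single
# sample mean is sizeable relative to a reported 4-point decline.
# (The SD of 15 is an assumption; the study's figures may differ.)
import math

def standard_error(sd, n):
    return sd / math.sqrt(n)

se = standard_error(15, 79)
print(round(se, 2))  # about 1.69
```

With a standard error near 1.7 points on each mean, a 4-point difference between two small samples is suggestive but hardly decisive, which is why replication matters here.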
Thursday, July 02, 2015
Now the British Academy Sees a Problem
Yesterday I referred to the poor numeracy skills of British (England and Northern Ireland) tertiary graduates reported by the OECD.
Now the British Academy has had its say. It reports that the performance of British school pupils is mediocre and that many undergraduate students are weak in statistics.
But it looks like middling (compared to the OECD average) secondary school students become almost rock bottom (in the OECD) tertiary graduates. Could it be that British universities are actually subtracting relative value from their students?
The Academy notes:
"Our school pupils tend to be ranked only in the middle of developed nations in mathematics. Our undergraduates embark on degree courses with varying, and often weak, fluency in statistics. And, in the workplace, demand for more advanced quantitative skills has risen sharply in the past two decades."

Perhaps this has something to do with relatively high graduation rates at British universities, so that mediocre students with weak numeracy skills will be recorded as tertiary graduates while their counterparts in most of the OECD will drop out and remain classified as secondary graduates. Even if that were the case, the mediocrity of secondary students and tertiary graduates would still need to be addressed.
The Academy proposes a strategy that includes improving the quality of quantitative skills teaching, reviewing school curricula and addressing the early dropping of maths by secondary school students.
I suspect that that will be insufficient.
Wednesday, July 01, 2015
Today India, Tomorrow Japan, Then ....
The ranking businesses are extending their global tentacles. Times Higher Education has produced a "snapshot" MENA ranking that produced interesting results -- Texas A&M University Qatar top for research impact -- and will be announcing their world rankings from Melbourne.
Meanwhile, QS will be in India next week to reveal their latest BRICS rankings and has been getting attention in new places for its subject rankings that get to places other rankings won't go.
With QS getting more international, it is no surprise to hear that Mitsui & Co, Ltd, has purchased shares in QS:
'Nunzio Quacquarelli, CEO of QS, said the investment from Mitsui “can especially support our development in Asia”, adding, “we were seeking and have found a likeminded company which shares our long term vision” '.

This is not the first sign of Mitsui's interest in tertiary education:
'Last year the company also invested $5m in Synergis Education, an education company specialising in online and on the ground adult learner programmes.
“We aim to use our experience in the online education field to create new services,” said Takeshi Akutsu, GM of Mitsui Service Business Division in a statement.
“At the same time, through this business we will help to nurture the global human resources needed by the global economy.” '
I wonder if QS will try and start ranking online courses next.
Tuesday, June 30, 2015
What's the Problem with U-Multirank and AHELO?
In a recent post, I discussed the contrast between the poor skills of young people in the UK (strictly speaking Northern Ireland and England) and the high regard in which British universities are held by the brand name rankers.
There is a piece of data in the skills report from the OECD that is interesting in this respect. Figure 2.2 shows the average numeracy skills of new graduates (age 16-29, 2012). It is depressing reading. The data for tertiary graduates show that only Italy does worse than the UK, and Ireland is either the same or almost the same. The US is very slightly ahead. The top scorers are Austria, Flanders and the Czech Republic.
Something that should have everybody running around doing research and forming committees is that British tertiary graduates are only very slightly better than most European secondary graduates and slightly better than South Koreans with less than an upper secondary qualification.
It is possible, indeed quite probable, that British tertiary graduates do better on verbal skills, and likely that they could conduct themselves well in interviews. Perhaps also it is places like the University of East London and Bolton University that are dragging down the British average. But this dramatically poor performance is in such glaring contrast to the preening self-satisfaction of the higher education establishment that some discussion at least is called for.
We may be seeing an explanation for the reluctance of the Russell Group and its orbiters, and of the Ivy League, to cooperate with U-Multirank, and for their disdain for the AHELO project, which stands in marked contrast with their support for the trusted and prestigious THE rankings. They are quite happy to be assessed on reputation, resources, income and citations, but comparison with the cognitive skills of graduates from the upstarts of East Asia, and perhaps Eastern and Central Europe, is something to be avoided.
Saturday, June 27, 2015
Why Russia Might Rise Fairly Quickly in the Rankings After Falling a Bit
An article by Alex Usher in Higher Education in Russia and Beyond, reprinted in University World News, suggests five structural reasons why Russian universities will not rise very quickly in the global rankings. These are:
- the concentration of resources in academies rather than universities
- excessive specialisation among existing universities
- a shortage of researchers caused by the economic crisis of the nineties
- excessive bureaucratic control over research projects
- limited fluency in English.
Over the next couple of years things might even get a bit worse. QS is considering introducing a sensible form of field normalisation, applied just to the five main subject groups. This might not happen, since QS is well aware of the further advantage it would give to English-speaking universities, especially Oxbridge and places like Yale and Princeton, that are strong in the humanities and social sciences. But if it did, it would not be good for Russian universities. Meanwhile, THE has spoken about doing something about hugely cited multi-authored physics papers, and that could drastically affect institutions like MEPhI.
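To see why field normalisation matters, here is a minimal sketch of the idea as QS is reported to be considering it: a university's citation rate in each broad subject group is compared against a world average for that group, rather than against all fields pooled together. The baseline figures and example universities below are invented purely for illustration; QS has not published a formula.

```python
# Hypothetical world-average citations per paper for the five broad
# subject groups (invented figures, for illustration only).
FIELD_BASELINES = {
    "arts_humanities": 2.0,
    "engineering_technology": 6.0,
    "life_sciences_medicine": 12.0,
    "natural_sciences": 10.0,
    "social_sciences": 4.0,
}

def normalised_impact(citations_per_paper: dict) -> float:
    """Average of (field citation rate / field baseline) over the
    fields in which a university publishes. A score of 1.0 means
    the university cites at the world average for its fields."""
    ratios = [
        citations_per_paper[field] / FIELD_BASELINES[field]
        for field in citations_per_paper
    ]
    return sum(ratios) / len(ratios)

# A humanities-strong university is no longer penalised for the low
# raw citation rates typical of its fields:
humanities_strong = {"arts_humanities": 3.0, "social_sciences": 5.0}
physics_strong = {"natural_sciences": 11.0, "engineering_technology": 6.0}

print(normalised_impact(humanities_strong))  # 1.375
print(normalised_impact(physics_strong))     # 1.05
```

Under a pooled (non-normalised) count, the physics-strong university would win easily on raw citations per paper; once each field is measured against its own baseline, the humanities-strong university comes out ahead, which is why such a change would favour institutions strong in the humanities and social sciences.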
But after that, there are special features in the QS and THE world rankings that could be exploited by Russian universities.
Russia is surrounded by former Soviet countries where Russian is widely used and which could provide large numbers of international research collaborators, an indicator in the THE rankings, and could be a source of international students and faculty, indicators in the THE and QS rankings and a source of respondents to the THE and QS academic surveys.
Russia might also consider tapping the Chinese supply of bright students for STEM subjects. It is likely that the red bourgeoisie will start wondering about the wisdom of sending their heirs to universities that give academic credit for things like walking around with a mattress or not shaving armpit hair, and will think instead about a degree in engineering from Moscow State or MEPhI.
Russian universities also appear to have a strong bias towards applied sciences and vocational training that should, if marketed properly, produce high scores in the QS employer survey and the THE Industry Income: Innovation indicator.