Saturday, September 19, 2015
Who's interested in the Round University Rankings?
The top results from a Google search for responses to the recently published Round Universities Rankings
Who's Interested in the Shanghai Rankings?
First results from a Google search for responses to the latest edition of the Shanghai world rankings.
Radboud University: 132nd place on ARWU/Shanghai ranking 2015
Friday, September 18, 2015
The Italian Ranking Dance
As noted in the previous post, the latest QS world rankings have not been well received by the Italian blog ROARS. Their opinion of the reaction of the Italian media and public was summarised by posting the following video
Who believes QS?
From the Italian site ROARS: Return on Academic Research (Google translation):
According to the Quacquarelli Symonds (QS) ranking, Siena, where nothing much can have happened in a single year, has lost 220 (two hundred and twenty) positions. Pavia and Turin have collapsed by more than 150 places, falling out of the top 500; Pisa, Tor Vergata, Federico II of Naples, Milan Catholic, Genoa, Perugia and Bicocca have each lost more than 100 positions. The meltdown is simply due to the fact that QS has changed the methodology used to construct its ranking. Only the Polytechnics of Milan and Turin have gained places, as predicted by Richard Holmes more than a month ago, when news of the change in methodology spread. I hope that the collapse of the Italian universities in 2015, "certified" by QS and caused by the change of methodology, will be a lesson: rankings are not a serious way to evaluate the performance of universities. Unfortunately, judging from the press releases of POLIMI and POLITO, it seems that the lesson has not been taken to heart.
Thursday, September 17, 2015
University Quality and Bias
Anticipating requests, here is the link to a significant paper by Christopher Claassen of the University of Essex.
Measuring University Quality by Christopher Claassen
http://www.chrisclaassen.com/University_rankings.pdf
Tuesday, September 15, 2015
Auto-Induced Fly Catching
Taking a break from the most exciting or second most exciting educational event this month, I have just received a message from Google Scholar Citations asking if I wanted to add the following to my profile:
JG LANE, RJ Holmes
BRITISH VETERINARY JOURNAL 128 (9), 477-&, 1972
Unfortunately, I couldn't claim credit for this work. In 1972 I was still immersed in the Irish Home Rule Debate and the rise of the Sokoto Caliphate.
I wonder whether this was an attempt to develop an environmentally friendly form of pest control or an account of a serious mental disorder among certain kinds of dogs?
Is it too late to submit a nomination for the Ig Nobel awards?
Tuesday, September 08, 2015
Global Ranking From Russia
A very interesting new set of global rankings appeared seven days ago, the Round University Ranking from Russia. The organization behind it is rather mysterious, although probably less so in Russia and nearby places.
The rankings are based entirely on data from Thomson Reuters (TR) and the structure and methodology are similar to last year's Times Higher Education (THE) World University Rankings. They include 12 out of the 13 indicators used in the 2014 THE rankings, with only the percentage of research income derived from industry omitted. There are eight more measures making a total of twenty, five each for teaching, research, international diversity and financial sustainability.
There is a normalized citations indicator with a weighting of only eight per cent, balanced by a simple count of citations per academic and research staff, also with eight per cent.
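To see what that balance means in practice, here is a minimal sketch of how the two kinds of citation indicator differ. The numbers are invented and neither function is RUR's actual formula.

```python
# Illustrative sketch, not RUR's actual formulas: the difference between a
# field-normalised citation score and a raw per-staff citation count.

def normalised_impact(papers):
    """Mean ratio of each paper's citations to the world average
    for its field and year (the usual 'normalised citations' idea)."""
    return sum(p["citations"] / p["world_avg"] for p in papers) / len(papers)

def citations_per_staff(papers, staff):
    """Simple per-capita count: total citations divided by staff numbers."""
    return sum(p["citations"] for p in papers) / staff

# Hypothetical university: two papers, 1,000 staff (invented numbers).
papers = [
    {"citations": 40, "world_avg": 10.0},  # four times the world average
    {"citations": 2,  "world_avg": 0.5},   # also four times the world average
]
print(normalised_impact(papers))          # 4.0 -- size-independent quality measure
print(citations_per_staff(papers, 1000))  # 0.042 -- rewards sheer volume per head
```

The first measure is indifferent to how much a university publishes; the second rewards volume, which is presumably why RUR gives them equal weight.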
Altogether the three reputation indicators account for 18 per cent of the weighting, compared to 33 per cent in the 2014 THE rankings or 50 per cent in the Quacquarelli Symonds (QS) world rankings.
To date these rankings appear to have been ignored by the world media except in Russia and its neighbors. Compared to the excitement with which the THE or even the QS or Shanghai rankings are greeted this might seem a bit odd. If the THE rankings were sophisticated because they had 13 indicators then these are even more so with 20. If the THE rankings were trusted because they were powered by Thomson Reuters so are these. If the survey in the THE rankings was real social science then so is this.
Could it be that the THE rankings are beloved of the Russell Group and its like around the world not because of their robustness, comprehensiveness, transparency or superior methodology but because of the glamour derived from a succession of prestigious events, networking dinners and exclusive masterclasses designed to appeal to the status anxieties of upwardly or downwardly mobile university administrators?
There are some problems with the RUR rankings. There is incoherence about what the indicators are supposed to measure. The methodology says that '[I]t is assumed that "undergraduate" level is the core of higher education', so there is an indicator measuring academic staff per bachelor degree. But then we have a weighting of eight per cent for doctoral degrees per bachelor degree.
One excellent thing about these rankings is that the scores for all of the indicators can be found in the profiles of the individual universities. If anyone has the energy and time, there are some important questions that could be answered. Is the correlation between teaching and research reputation so high that a distinction between the two is redundant? Is income or number of faculty a better predictor of research performance?
The presentation leaves a lot to be desired. Cooper League? The explanation of the methodology verges on the incomprehensible. Can somebody tell RUR to get a competent human to translate for them and forget about Google Translate?
The economics of the relationship between TR and RUR are puzzling. There are no obvious signs that RUR has a large income from which to pay TR for the data and I doubt that TR has passed it on for purely altruistic reasons. Could it be that TR are simply trying to undercut THE's attempt to go it alone? If nothing else, it could undermine any THE plans to go into the benchmarking and consulting trade.
Anyway, here are some first places. No surprises here, except maybe for Scuola Normale Superiore Pisa. You can find out exactly where the strengths of that school are by checking the scores for the twenty indicators.
Overall: Harvard
Teaching: Caltech
Research: Chicago
International Diversity: EPF Lausanne
Financial Sustainability: Caltech
China: Peking 49th
Russia: Moscow State 187th
India: IIT Kharagpur 272nd
UK: ICL 5th
Germany: Munich 22nd
France: Ecole Polytechnique 19th
Egypt: American University Cairo 571st
South Africa: Cape Town 201st
Brazil: Sao Paulo 65th
Italy: Scuola Normale Superiore Pisa 66th
Turkey: METU 308th
Malaysia: Universiti Putra Malaysia 513th
Australia: ANU 77th
Japan: Tokyo 47th
Korea: KAIST 41st.
Sunday, September 06, 2015
More on Alternative Indicators for Ranking African Universities
Continuing with our exploration of how to rank universities outside the world's top 200 or 400, where it is necessary to develop robust and sophisticated techniques of standardisation, normalisation, scaling, regional modification, taking away the number you first thought of (just kidding), verification, weighting and validation to figure out that Caltech's normalised research impact is slightly better than Harvard's or that Cambridge is a bit more international than that place in the other Cambridge, here is a ranking of African universities according to recommendations on LinkedIn.
There are obvious problems with this indicator, not least of which is the tiny number of responses compared to all the students on the continent. It might, however, be the precursor to a useful survey of student opinion or graduate employability later on.
First place goes to the University of South Africa, an open distance education institution whose alumni include Nelson Mandela, Cyril Ramaphosa and Jean-Bertrand Aristide. Makerere University, the University of Nairobi and Kenyatta University do well.
Data was compiled on the 28th and 29th of July. The universities covered were all those included in the THE experimental African ranking, the top fifty African universities in Webometrics, plus the top universities in Webometrics or 4icu of any country still not included.
Rank | University | Country | Number of LinkedIn Recommendations |
---|---|---|---|
1 | University of South Africa | South Africa | 154 |
2 | Makerere University | Uganda | 116 |
3 | University of the Witwatersrand | South Africa | 94 |
4 | University of Ibadan | Nigeria | 86 |
5 | University of Johannesburg | South Africa | 79 |
6 | University of Nairobi | Kenya | 75 |
7 | Cairo University | Egypt | 67 |
8 | Stellenbosch University | South Africa | 63 |
9 | University of Pretoria | South Africa | 62 |
10 | Kenyatta University | Kenya | 61 |
11 | University of Cape Town | South Africa | 60 |
12 | University of Lagos | Nigeria | 58 |
13 | Addis Ababa University | Ethiopia | 55 |
14 | Obafemi Awolowo University | Nigeria | 50 |
15 | Alexandria University | Egypt | 47 |
16 | Rhodes University | South Africa | 42 |
17 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 40 |
18 | American University in Cairo | Egypt | 28 |
19 | University of Kwazulu-Natal | South Africa | 26 |
20 | University of Ilorin | Nigeria | 24 |
21 | University of Zimbabwe | Zimbabwe | 22 |
22 | Kwame Nkrumah University of Science and Technology | Ghana | 21 |
23 | Helwan University | Egypt | 20 |
24= | North West University | South Africa | 18 |
24= | University of Ghana | Ghana | 18 |
24= | University of Port Harcourt | Nigeria | 18 |
27= | Durban University of Technology | South Africa | 16 |
27= | University of Dar Es Salaam | Tanzania | 16 |
29= | Nelson Mandela Metropolitan University | South Africa | 14 |
29= | University of the Western Cape | South Africa | 14 |
31 | Cape Peninsula University of Technology | South Africa | 13 |
32 | Mansoura University | Egypt | 12 |
33 | University of Botswana | Botswana | 10 |
34 | Covenant University | Nigeria | 9 |
35= | Zagazig University | Egypt | 7 |
35= | Suez Canal University | Egypt | 7 |
37 | Tanta University | Egypt | 6 |
38= | Assiut University | Egypt | 5 |
38= | Université Constantine 1 | Algeria | 5 |
40= | University of the Free State | South Africa | 4 |
40= | Universite des Sciences et de la Technologie Houari Boumediene | Algeria | 4 |
42+ | South Valley University | Egypt | 3 |
42+ | Université Cadi Ayyad | Morocco | 2 |
42+ | University of Tunis | Tunisia | 2 |
42+ | University of Namibia | Namibia | 1 |
42+ | University of Mauritius | Mauritius | 1 |
42+ | Université Cheikh Anta Diop | Senegal | 0 |
42+ | Université Mohammed V Souissi | Morocco | 0 |
42+ | University of Khartoum | Sudan | 0 |
42+ | University of Malawi | Malawi | 0 |
42+ | Université Hassan II Ain Chock | Morocco | 0 |
42+ | Kafrelsheikh University | Egypt | 0 |
42+ | University of Zambia | Zambia | 0 |
42+ | Bejaia University | Algeria | 0 |
42+ | Minia University | Egypt | 0 |
42+ | Benha University | Egypt | 0 |
42+ | Universidade Católica de Angola | Angola | 0 |
42+ | Université de Lomé | Togo | 0 |
42+ | Université Abou Bekr Belkaid | Algeria | 0 |
42+ | Beni-Suef University | Egypt | 0 |
42+ | Université Omar Bongo | Gabon | 0 |
42+ | University of the Gambia | Gambia | 0 |
42+ | Université de Toliara | Madagascar | 0 |
42+ | Université Kasdi Merbah Ouargla | Algeria | 0 |
42+ | Universite de la Reunion | Reunion | 0 |
42+ | Université d'Abomey-Calavi | Benin | 0 |
42+ | Universidade Eduardo Mondlane | Mozambique | 0 |
42+ | Université de Ouagadougou | Burkina Faso | 0 |
42+ | University of Rwanda | Rwanda | 0 |
42+ | Universite de Bamako | Mali | 0 |
42+ | University of Swaziland | Swaziland | 0 |
42+ | Université Félix Houphouët-Boigny | Ivory Coast | 0 |
42+ | Université de Kinshasa | Democratic Republic of the Congo | 0 |
42+ | National University of Lesotho | Lesotho | 0 |
42+ | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0 |
42+ | National Engineering School of Sfax | Tunisia | 0 |
42+ | Université Marien Ngouabi | Republic of the Congo | 0 |
42+ | University of Liberia | Liberia | 0 |
42+ | Université Djillali Liabes | Algeria | 0 |
42+ | Université Abdou Moumouni de Niamey | Niger | 0 |
42+ | Misurata University | Libya | 0 |
42+ | Université de Dschang | Cameroon | 0 |
42+ | Université de Bangui | Central African Republic | 0 |
42+ | Université de Nouakchott | Mauritania | 0 |
42+ | Eritrea Institute of Technology | Eritrea | 0 |
42+ | Université de Djibouti | Djibouti | 0 |
42+ | University of Seychelles | Seychelles | 0 |
42+ | Mogadishu University | Somalia | 0 |
42+ | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0 |
42+ | Universite Gamal Abdel Nasser de Conakry | Guinea | 0 |
42+ | University of Makeni | Sierra Leone | 0 |
42+ | John Garang Memorial University | South Sudan | 0 |
42+ | Hope Africa University | Burundi | 0 |
42+ | Universite de Moundou | Chad | 0 |
42+ | Universite de Yaounde I | Cameroon | 0 |
Tuesday, September 01, 2015
Best German and Austrian Universities if you Want to get Rich
If you want to go to a university in Germany or Austria and get rich afterwards, the website Wealth-X has a ranking for you. It counts the number of UHNW (ultra high net worth) alumni, those worth US$30 million or above.
Here are the top five with the number of UHNW individuals in brackets.
1. University of Cologne (18)
2. University of Munich (14)
3. University of Hamburg (13)
4. University of Freiburg (11)
5. University of Bonn (11)
There may well be protests about who should be first. In tenth place is "Ludwig Maximilians University Munich (LMU Munich)", which I assume is another name for the University of Munich, with six UHNW alumni.
Monday, August 31, 2015
Update on changes in ranking methodology
Times Higher Education (THE) have been preparing the ground for methodological changes in their world rankings. A recent article by Phil Baty announced that the new world rankings scheduled for September 30 will not count the citations to 649 papers, mainly in particle physics, with more than 1000 authors.
This is perhaps the best that is technically and/or commercially feasible at this moment but it is far from satisfactory. Some of these publications are dealing with the most basic questions about the nature of physical reality and it is a serious distortion not to include them in the ranking methodology. There have been complaints about this. Pavel Krokovny's comment was noted in a previous post while Mete Yeyisoglu argues that:
"Fractional counting is the ultimate solution. I wish you could have worked it out to use fractional counting for the 2015-16 rankings.
The current interim approach you came up with is objectionable.
Why 1,000 authors? How was the limit set? What about 999 authored-articles?
Although the institution I work for will probably benefit from this interim approach, I think you should have kept the same old methodology until you come up with an ultimate solution.
This year's interim fluctuation will adversely affect the image of university rankings."
Baty provides a reasonable answer to the question of why the cut-off point is 1,000 authors.
But there is a fundamental issue developing here that goes beyond ranking procedure. The concept of authorship of a philosophy paper written entirely by a single person or a sociological study from a small research team is very different from that of the huge multi-national capital and labour intensive publications in which the number of collaborating institutions exceeds the number of paragraphs and there are more authors than sentences.
Fractional counting does seem to be the only fair and sensible way forward and it is now apparently on THE's agenda although they have still not committed themselves.
The objection could be raised that while the current THE system gives a huge reward to even the least significant contributing institution, fractional counting would give major research universities insufficient credit for their role in important research projects.
A long term solution might be to draw a distinction between the contributors to and the authors of the mega papers. For most publications there would be no need to draw such a distinction, but for those with some sort of input from dozens, hundreds or thousands of people it might be feasible to allot half the credit to all those who had anything to do with the project and the other half to those who meet the standard criteria of authorship. There would no doubt be a lot of politicking about who gets the credit but that would be nothing new.
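On my reading of that proposal, and with invented numbers, the arithmetic might look something like this:

```python
# A sketch of the contributor/author split suggested above (my reading of
# the proposal, with invented numbers): half the citation credit is shared
# among everyone who contributed, half among those meeting standard
# authorship criteria.

def split_credit(citations, n_contributors, n_authors):
    contributor_share = 0.5 * citations / n_contributors
    author_share = 0.5 * citations / n_authors
    # Authors are also contributors, so they receive both shares.
    return contributor_share, contributor_share + author_share

per_contributor, per_author = split_credit(3000, 2000, 100)
print(per_contributor)  # 0.75 citations credited to each mere contributor
print(per_author)       # 15.75 citations credited to each full author
```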
Duncan Ross, the new Data and Analytics Director at THE, seems to be thinking along these lines.
"In the longer term there are one technical and one structural approach that would be viable. The technical approach is to use a fractional counting approach (2932 authors? Well you each get 0.034% of the credit). The structural approach is more of a long term solution: to persuade the academic community to adopt metadata that adequately explains the relationship of individuals to the paper that they are ‘authoring’. Unfortunately I’m not holding my breath on that one."The counting of citations to mega papers is not the only problem with the THE citations indicator. Another is the practice of giving a boost to universities in underperforming countries. Another item by Phil Baty quotes this justification from Thomson Reuters, THE's former data partner.
“The concept of the regional modification is to overcome the differences between publication and citation behaviour between different countries and regions. For example some regions will have English as their primary language and all the publications will be in English, this will give them an advantage over a region that publishes some of its papers in other languages (because non-English publications will have a limited audience of readers and therefore a limited ability to be cited). There are also factors to consider such as the size of the research network in that region, the ability of its researchers and academics to network at conferences and the local research, evaluation and funding policies that may influence publishing practice.”
THE now appear to agree that this is indefensible in the long run and hope that a more inclusive academic survey and the shift to Scopus, with broader coverage than the Web of Science, will lead to this adjustment being phased out.
It is a bit odd that TR and THE should have introduced income, in three separate indicators, and international outlook, in another three, as markers of excellence, but then included a regional modification to compensate for limited funding and international contacts.
THE are to be congratulated for having put fractional counting and phasing out the regional modification on their agenda. Let's hope it doesn't take too long.
While we are on the topic, there are some more things about the citation indicator to think about. First, to repeat a couple of points mentioned in the earlier post.
- Reducing the number of fields or doing away with normalisation by year of citation. The more boxes into which any given citation can be dropped, the greater the chance of statistical anomalies when a cluster of citations meets a low world average for that particular year of citation, year of publication and field (300 in Scopus?). See the sketch after this list.
- Reducing the weighting for this indicator. Perhaps citations per paper normalized by field is a useful instrument for comparing the quality of research of MIT, Caltech, Harvard and the like but it might be of little value when comparing the research performance of Panjab University and IIT Bombay or Istanbul University and Bogazici.
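Here is the anomaly behind the first point in miniature, with invented world averages:

```python
# The anomaly in miniature (invented world averages): the same ten citations
# are unremarkable in a highly cited field-year cell but 'world-beating' in
# a lowly cited one, because normalisation divides by the cell's average.
world_avg = {("oncology", 2014): 25.0,  # highly cited field-year cell
             ("history", 2014): 0.4}    # lowly cited field-year cell

def cell_normalised(citations, field, year):
    return citations / world_avg[(field, year)]

print(cell_normalised(10, "oncology", 2014))  # 0.4
print(cell_normalised(10, "history", 2014))   # 25.0
```

The more cells there are, the more often a handful of citations will land in a cell with a tiny denominator.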
Some other things THE could think about.
- Adding a measure of overall research impact, perhaps simply by counting citations. At the very least, stop calling field- and year-normalised, regionally modified citations per paper a measure of research impact. Call it research quality or something like that.
- Doing something about secondary affiliations. So far this seems to have been a problem mainly for the Highly Cited Researchers indicator in the Shanghai ARWU but it may not be very long before more universities realise that a few million dollars for adjunct faculty could have a disproportionate impact on publication and citation counts.
- Also, perhaps THE should consider excluding self-citations (or even citations within the same institution although that would obviously be technically difficult). Self-citation caused a problem in 2010 when Dr El Naschie's diligent citation of himself and a few friends lifted Alexandria University to fourth place in the world for research impact. Something similar might happen again now that THE are using a larger and less selective database.
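Excluding self-citations is, at least in principle, computationally simple. A sketch, assuming a data model in which each citation carries the citing paper's author list:

```python
# Sketch of stripping self-citations before counting (assumed data model:
# each citation records the citing paper's author list).
def non_self_citations(cited_authors, citing_author_lists):
    """Count only citations whose citing paper shares no author with the cited paper."""
    cited = set(cited_authors)
    return sum(1 for citing in citing_author_lists if cited.isdisjoint(citing))

citing_papers = [["El Naschie"], ["Smith", "Jones"], ["El Naschie", "Coauthor"]]
print(non_self_citations(["El Naschie"], citing_papers))  # 1
```

Citations within the same institution could be filtered the same way, given affiliation data, which is where the technical difficulty lies.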
Friday, August 28, 2015
The Richest University in China ...
... is Tsinghua University but Zhejiang, Peking and Shanghai Jiao Tong Universities appear to be more productive, as measured by the Publications indicator in the Shanghai rankings.
China Daily has just published a list of the top ten universities in China ranked according to annual income as reported to the Ministry of Education. Here they are with the Publications score (papers in the Science Citation Index and the Social Science Citation Index in 2014) in brackets.
1. Tsinghua University 17.56 billion yuan (63.8)
2. Zhejiang University 15.64 billion yuan (68.5)
3. Peking University 12.85 billion yuan (64)
4. Shanghai Jiao Tong University 11.89 billion yuan (68.5)
5. Fudan University 7.71 billion yuan (56.1)
6. Wuhan University 6.83 billion yuan (45.8)
7. Jilin University 6.82 billion yuan (50.7)
8. Huazhong University of Science and Technology 6.81 billion yuan (53.1)
9. Sun Yat-sen University 6.69 billion yuan (54.9)
10. Sichuan University 6.58 billion yuan (54.2).
Tuesday, August 25, 2015
Not fair to call papers freaky
A comment by Pavel Krokovny of Heidelberg University about THE's proposal to exclude papers with 1,000+ authors from their citations indicator in the World University Rankings.
"It is true that all 3k+ authors do not draft the paper together, on the contrary, only a small part of them are involved in this very final step of a giant research work leading to a sound result. It is as well true that making the research performed public and disseminating the knowledge obtained is a crucial step of the whole project.
But what you probably missed is that this key stage would not be possible at all without a unique setup which was built and operated by profoundly more physicists and engineers than those who processed raw data and wrote a paper. Without that "hidden part of the iceberg" there would be no results at all. And it would be completely wrong to assume that the authors who did the data analysis and wrote the paper should be given the highest credit in the paper. It is very specific for the experimental HEP field that has gone far beyond the situation that was common still in the first half of 20th century when one scientist or a small group of them might produce some interesting results. The "insignificant" right tail in your distribution of papers on number of coauthors contains the hot part of the modern physics with high impact results topped by the discovery of Higgs-boson. And in your next rankings you are going to dishonour those universities that contributed to this discovery."
and
"the point is that frequent fluctuations of the ranking methodology might damage the credibility of the THE. Certainly, I do not imply here large and well-esteemed universities like Harvard or MIT. I believe their high rankings positions not to be affected by nearly any reasonable changes in the methodology. However, the highest attention to the rankings is attracted from numerous ordinary institutions across the world and their potential applicants and employees. In my opinion, these are the most concerned customers of the THE product. As I already pointed out above, it's very questionable whether participation in large HEP experiments (or genome studies) should be considered "unfair" for those institutions."
Sunday, August 23, 2015
Changes in Ranking Methodology
This year and next the international university rankings appear to be set for more volatility with unusually large upward and downward movement, partly as a result of changes to the methodology for counting citations in the QS and THE rankings.
ARWU
The global ranking season kicked off last week with the publication of the latest edition of the Academic Ranking of World Universities from the ShanghaiRanking Consultancy (SRC), which I hope to discuss in detail in a little while. These rankings are rather dull and boring, which is exactly what they should be. Harvard is, as always, number one for all but one of the indicators. Oxford has slipped from joint ninth to tenth place. Warwick has leaped into the top 100 by virtue of a Fields medal. At the foot of the table there are new contenders from France, Korea and Iran.
Since they began in 2003 the Shanghai rankings have been characterised by a generally stable methodology. In 2012, however, they had to deal with the recruitment of a large and unprecedented number of adjunct faculty by King Abdulaziz University. Previously SRC had simply divided the credit for the Highly Cited Researchers indicator equally between all institutions listed as affiliations. In 2012 and 2013 they wrote to all highly cited researchers with joint affiliations and thus determined the division of credit between primary and secondary affiliations. Then, in 2014 and this year they combined the old Thomson Reuters list, first issued in 2001, and the new one, issued in 2014, and excluded all secondary affiliations in the new list.
The result was that in 2014 the rankings showed an unusual degree of volatility, although this year things are a lot more stable. My understanding is that Shanghai will move to counting only the new list next year, again without secondary affiliations, so there should be a lot of interesting changes then. It looks as though Stanford, Princeton, University of Wisconsin -- Madison, and Kyoto University will suffer because of the change while University of California Santa Cruz, Rice University, University of Exeter and University of Wollongong will benefit.
While SRC has efficiently dealt with the issue of secondary affiliation with regard to its Highly Cited indicator, the issue has now resurfaced in the unusually high scores achieved by King Abdulaziz University for publications, largely because of its adjunct faculty. Expect more discussion over the next year or so. It would seem sensible for SRC to consider a five or ten year period rather than one year for their Publications indicator, and academic publishers, the media and rankers in general may need to give some thought to the proliferation of secondary affiliations.
QS
On July 27 Quacquarelli Symonds (QS) announced that for 18 months they had been thinking about normalising the counting of citations across five broad subject areas. They observed that a typical institution would receive about half of its citations from the life sciences and medicine, over a quarter from the natural sciences but just 1% from the arts and humanities.
In their forthcoming rankings QS will assign a 20% weighting for citations to each of the five subject areas, something that, according to Ben Sowter, Research Director at QS, they have already been doing for the academic opinion survey.
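For the record, here is a rough sketch of what equal weighting across the five areas could look like. The citation counts are invented and the "world" row is a crude benchmark standing in for whatever normalisation QS actually applies within each area.

```python
# Equal weighting across five broad subject areas (illustrative only).
AREAS = ["life sciences & medicine", "natural sciences",
         "engineering & technology", "social sciences", "arts & humanities"]

def area_weighted_score(uni, world):
    """Each of the five broad areas contributes an equal 20% of the score."""
    return sum(0.20 * uni[a] / world[a] for a in AREAS)

uni =   {"life sciences & medicine": 5000, "natural sciences": 2600,
         "engineering & technology": 1200, "social sciences": 800,
         "arts & humanities": 100}
world = {"life sciences & medicine": 4000, "natural sciences": 2500,
         "engineering & technology": 1000, "social sciences": 700,
         "arts & humanities": 80}
print(round(area_weighted_score(uni, world), 2))  # ~1.18
```

Under a scheme like this, a hundred arts and humanities citations can count for as much as thousands in medicine, which is exactly the rebalancing QS describes.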
It would seem then that there are likely to be some big rises and big falls this September. I would guess that places strong in humanities, social sciences and engineering like LSE, New York University and Nanyang Technological University may go up and some of the large US state universities and Russian institutions may go down. That's a guess because it is difficult to tell what happens with the academic and employer surveys.
QS have also made an attempt to deal with the issue of hugely cited papers with hundreds, even thousands of "authors" -- contributors would be a better term -- mainly in physics, medicine and genetics. Their approach is to exclude all papers with more than 10 contributing institutions, that is 0.34% of all publications in the database.
This is rather disappointing. Papers with huge numbers of authors and citations obviously do have distorting effects but they have often dealt with fundamental and important issues. To exclude them altogether is to ignore a very significant body of research.
The obvious solution to the problem of multi-contributor papers is fractional counting, dividing the number of citations by the number of contributors or contributing institutions. QS claim that to do so would discourage collaboration, which does not sound very plausible.
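The arithmetic of fractional counting could hardly be simpler, which is part of its appeal:

```python
# Fractional counting in one line: a paper's citations are divided among its
# contributing institutions instead of being counted in full for every one.
def fractional_credit(citations, n_institutions):
    return citations / n_institutions

# Full counting: each of 20 institutions gets all 3,000 citations.
# Fractional counting: each gets 150.
print(fractional_credit(3000, 20))  # 150.0
```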
In addition, QS will likely extend the life of survey responses from three to five years. That could make the rankings more stable by smoothing out annual fluctuations in survey responses and reduce the volatility caused by the proposed changes in the counting of citations.
The shift to a moderate version of field normalisation is helpful as it will reduce the undue privilege given to medical research, without falling into the huge problems that result from using too many categories. It is unfortunate, however, that QS have not taken the plunge into fractional counting. One suspects that technical problems and financial considerations might be as significant as the altruistic desire not to discourage collaboration.
After a re-sorting in September the QS rankings are likely to become a bit more stable and credible but their most serious problem, the structure, validity and excessive weighting of the academic survey, has still not been addressed.
THE
Meanwhile, Times Higher Education (THE) has also been grappling with the issue of authorship inflation. Phil Baty has announced that this year 649 papers with over 1,000 authors will be excluded from their calculation of citations because "we consider them to be so freakish that they have the potential to distort the global scientific landscape".
But it is not the papers that do the distorting. It is the methodology. THE and their former data partners Thomson Reuters, like QS, have avoided fractional counting (except for a small experimental African ranking) and so every one of those hundreds or thousands of authors gets full credit for the hundreds or thousands of citations. This has given places like Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Universite Cadi Ayyad in Morocco and Bogazici University in Turkey remarkably high scores for Citations: Research Impact, much higher than their scores for the bundled research indicators.
THE have decided simply to exclude 649 papers, or 0.006% of the total, from their calculations for the world rankings, a far smaller share than QS is excluding. Again, this is a rather crude measure. Many of the "freaks" are major contributions to advanced research and deserve to be acknowledged by the rankings in some way.
THE did use fractional counting in their recent experimental ranking of African universities and Baty indicates that they are considering doing so in the future.
It would be a big step forward for THE if they introduce fractional counting of citations. But they should not stop there. There are other bugs in the citations indicator that ought to be fixed.
First, it does not at present measure what it is supposed to measure. It does not measure a university's overall research impact. At best, it is a measure of the average quality of research papers no matter how few (above a certain threshold) they are.
Second, the "regional modification", which divides the university citation impact score by the square root of the the score of the country where the university is located, is another source of distortion. It gives a bonus to universities simply for being located in underperforming countries. THE or TR have justified the modification by suggesting that some universities deserve compensation because they lack funding or networking opportunities. Perhaps they do, but this can still lead to serious anomalies.
Thirdly, THE need to consider whether they should assign citations to so many fields since this increases the distortions that can arise when there is a highly cited paper in a normally lowly cited field.
Fourthly, should they assign a thirty per cent weighting to an indicator that may be useful for distinguishing between the likes of MIT and Caltech but may be of little relevance for the universities that are now signing up for the world rankings?
Sunday, August 16, 2015
Research Ranking of African Universities
I suppose it should no longer be surprising that university heads flock to Johannesburg for the unveiling of an "experimental" research ranking of 30 African universities that put the University of Port Harcourt in 6th place, did not include Cairo University, the University of Ibadan, Ahmadu Bello University or the University of Nigeria Nsukka, placed Makerere above Stellenbosch and Universite Cadi Ayyad above the University of Pretoria.
It is still a bit odd that African universities seem to have ignored a reasonable and sensible research ranking from the Journals Consortium that I found while reading an article by Gerald Ouma in the Mail and Guardian Africa, which, by the way, had an advertisement about Universite Cadi Ayyad being number ten in the THE African ranking.
The Journals Consortium ranking is based on publications, citations and web visibility and altogether 1,447 institutions are ranked. The methodology, which is a bit thin, is here.
Here are the top ten.
1. University of Cape Town
2. Cairo University
3. University of Pretoria
4. University of Nairobi
5. University of South Africa
6. University of the Witwatersrand
7. Stellenbosch University
8. University of Ibadan
9. University of Kwazulu-Natal
10. Ain Shams University
The University of Port Harcourt is 36th and Universite Cadi Ayyad is 89th.
I am pleased to note that two of my former employers are in the rankings, University of Maiduguri in 66th place and Umar ibn Ibrahim El-Kanemi College of Education, Science and Technology (formerly Borno College of Basic Studies) in 988th.
Friday, August 14, 2015
This is also really frightening
From The Times, which is supposed to be a really posh paper -- I remember adverts "Top People Read the Times" -- read by people with degrees from Russell Group universities:
"Of the 3 million Muslims in Britain, about 2.3 million identify as Sunni, compared with 300,000 Shias, or 5 per cent of the total. Most British Shias have roots in Iran, Iraq, Azerbaijan or Bahrain. Sunnis make up the vast majority of Muslims worldwide."
"Of the 3 million Muslims in Britain, about 2.3 million identify as Sunni, compared with 300,000 Shias, or 5 per cent of the total. Most British Shias have roots in Iran, Iraq, Azerbaijan or Bahrain. Sunnis make up the vast majority of Muslims worldwide."
Thursday, August 13, 2015
This is really frightening
The evidence that human intelligence is falling continues to accumulate. PISA scores in Sweden are down and not just among immigrants. The intelligence of US marines, as measured by the General Classification Test, has been in decline since the 1980s. Based on "a small, probably representative sample" the French national IQ has dropped since 1999.
And now we have this from an article about the possible revival of the Liberal Democrats by Jon Stone, who is a reporter, in the Independent, which is a newspaper.
"Lazarus is a character in the Christian holy book The Bible who comes back from the dead after an intervention by Jesus Christ, a religious figure."
I thought the Independent was one of the posh papers read by bright people who had degrees and knew how ignorant and illiterate UKIP supporters were.
Does Jon Stone really think he has to explain to his readers what the Bible is? Or is this some sort of PC policy?
The worst thing is that he apparently doesn't know that Lazarus was really a character in a Robert Heinlein novel.
Wednesday, August 12, 2015
The Plague of Authorship Inflation
An article in the Wall Street Journal by Robert Lee Hotz describes the apparently inexorable increase in the number of authors of scientific papers.
In 2014, according to the Web of Science, the number of papers with 50 or more authors reached over 1,400 and the number with 500 or more was over 200. The situation is getting so bad that one journal, Nature, was unable to list all the authors of a paper in the print edition.
Hotz has an amusing digression where he recounts how scientists have listed a hamster, a dog and a computer as co-authors.
One issue that he does not explore is the way in which multi-authorship has distorted global university rankings. Times Higher Education and Thomson Reuters until this year declined to use fractional counting of citations in their World University Rankings, so that every one of hundreds of contributors was credited with every one of thousands of citations. When this was combined with normalisation by 250 fields, so that a few citations could have a disproportionate effect, and a deceptive regional modification that rewarded universities for being in a country that produced few citations, the results could be ludicrous. Unproductive institutions, for example Alexandria University, those that are very small, for example Scuola Normale Superiore Pisa, or very specialised, for example Moscow State Engineering Physics Institute, have been presented by THE as world leaders for research impact.
Let us hope that this indicator is reformed in the forthcoming world rankings.
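To see concretely what fractional counting would change, here is a minimal sketch in Python, with invented institutions and citation counts rather than THE's or Thomson Reuters' actual data, of how the two counting methods diverge once a single hyper-authored paper enters the mix.

```python
# A toy comparison of full versus fractional counting of citations.
# The papers and citation counts below are invented for illustration.

papers = [
    # (citations, institutions with at least one listed author)
    (2000, ["Inst A", "Inst B", "Inst C", "Inst D"]),  # hyper-authored paper
    (10, ["Inst B"]),                                  # ordinary paper
]

full, fractional = {}, {}
for citations, institutions in papers:
    share = citations / len(institutions)  # each institution's fractional share
    for inst in institutions:
        # Full counting: every contributor is credited with all citations.
        full[inst] = full.get(inst, 0) + citations
        # Fractional counting: the citations are divided among contributors.
        fractional[inst] = fractional.get(inst, 0) + share

print(full)        # {'Inst A': 2000, 'Inst B': 2010, 'Inst C': 2000, 'Inst D': 2000}
print(fractional)  # {'Inst A': 500.0, 'Inst B': 510.0, 'Inst C': 500.0, 'Inst D': 500.0}
```

Under full counting, one mega-collaboration paper makes every participating institution look like a citation powerhouse; under fractional counting, the credit is proportionate to each institution's share of the authorship.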
Sunday, August 09, 2015
Another Ranking Indicator for Africa
The prestigious and exclusive THE African summit is over. Whether it will lead to a serious regional ranking remains to be seen. The indicators used by THE in their world rankings and various regional spin-offs (reputation for research, income counted in three different indicators, citations, numbers of doctoral students) seem generally inappropriate for all but about two dozen institutions.
But there is still a need to compare and evaluate the effectiveness of African universities in providing instruction in academic, technical and professional subjects and perhaps in their participation in innovative and economically beneficial projects.
Probably the way ahead for African ranking is the use of social media, bypassing the very problematical collection of institutional data. More of that later.
Anyway, here is a ranking of African universities according to the number of results returned by a search of the WIPO Patentscope site. Searching was done on the 5th and 6th of August. The universities covered were the top 50 African universities in Webometrics plus any university in the recent THE pilot ranking. All fields were searched.
There are no real surprises. South Africa is dominant, followed by Egypt. The flagship universities of Uganda, Kenya, Ghana and Nigeria are represented. Most universities in Africa produce no innovative research that is reflected in patents.
Rank | University | Country | References in patents (any field) |
---|---|---|---|
1 | University of Cape Town | South Africa | 377 |
2 | University of Pretoria | South Africa | 242 |
3 | University of the Witwatersrand | South Africa | 217 |
4 | Stellenbosch University | South Africa | 165 |
5 | North-West University | South Africa | 125 |
6 | Cairo University | Egypt | 100 |
7 | University of the Free State | South Africa | 72 |
8 | University of Johannesburg | South Africa | 46 |
9 | University of KwaZulu-Natal | South Africa | 41 |
10 | Nelson Mandela Metropolitan University | South Africa | 34 |
11 | Assiut University | Egypt | 31 |
12 | Rhodes University | South Africa | 30 |
13 | University of Nairobi | Kenya | 21 |
14 | Makerere University | Uganda | 20 |
15 | University of the Western Cape | South Africa | 18 |
16 | American University in Cairo | Egypt | 17 |
17 | University of Ghana | Ghana | 13 |
18 | UniversitƩ Mohammed V Souissi | Morocco | 12 |
19 | Cape Peninsula University of Technology | South Africa | 11 |
20 | Mansoura University | Egypt | 10 |
21 | University of Namibia | Namibia | 9 |
22 | Alexandria University | Egypt | 8 |
23 | University of Ibadan | Nigeria | 7 |
24= | Kenyatta University | Kenya | 6 |
24= | University of Zimbabwe | Zimbabwe | 6 |
24= | Durban University of Technology | South Africa | 6 |
27= | University of South Africa | South Africa | 5 |
27= | Zagazig University | Egypt | 5 |
27= | Suez Canal University | Egypt | 5 |
30= | University of Dar es Salaam | Tanzania | 4 |
30= | Addis Ababa University | Ethiopia | 4 |
32= | University of Ilorin | Nigeria | 3 |
32= | University of Khartoum | Sudan | 3 |
32= | University of Malawi | Malawi | 3 |
35= | Helwan University | Egypt | 2 |
35= | UniversitƩ Hassan II Ain Chock | Morocco | 2 |
35= | UniversitƩ Cadi Ayyad Marrakech | Morocco | 2 |
35= | Kafrelsheikh University | Egypt | 2 |
35= | University of Zambia | Zambia | 2 |
35= | Ahmadu Bello University | Nigeria | 2 |
41= | University of Lagos | Nigeria | 1 |
41= | UniversitƩ Cheikh Anta Diop | Senegal | 1 |
41= | University of Mauritius | Mauritius | 1 |
41= | UniversitƩ de Constantine 1 | Algeria | 1 |
41= | UniversitƩ de YaoundƩ 1 | Cameroon | 1 |
46= | Obafemi Awolowo University | Nigeria | 0 |
46= | Kwame Nkrumah University of Science and Technology | Ghana | 0 |
46= | University of Port Harcourt | Nigeria | 0 |
46= | University of Botswana | Botswana | 0 |
46= | Tanta University | Egypt | 0 |
46= | Covenant University | Nigeria | 0 |
46= | Bejaia University | Algeria | 0 |
46= | Minia University | Egypt | 0 |
46= | University of Tunis | Tunisia | 0 |
46= | Benha University | Egypt | 0 |
46= | Universidade CatĆ³lica de Angola | Angola | 0 |
46= | UniversitƩ de LomƩ | Togo | 0 |
46= | South Valley University | Egypt | 0 |
46= | UniversitƩ Abou Bekr Belkaid | Algeria | 0 |
46= | Beni-Suef university | Egypt | 0 |
46= | UniversitƩ Omar Bongo | Gabon | 0 |
46= | University of The Gambia | Gambia | 0 |
46= | UniversitƩ de Toliara | Madagascar | 0 |
46= | UniversitƩ Kasdi Merbah Ouargla | Algeria | 0 |
46= | UniversitƩ de La RƩunion | RƩunion | 0 |
46= | Universidade Eduardo Mondlane | Mozambique | 0 |
46= | UniversitƩ de Ouagadougou | Burkina Faso | 0 |
46= | University of Rwanda | Rwanda | 0 |
46= | UniversitƩ de Bamako | Mali | 0 |
46= | University of Swaziland | Swaziland | 0 |
46= | UniversitƩ FƩlix Houphouƫt-Boigny | Ivory Coast | 0 |
46= | UniversitƩ de Kinshasa | Democratic Republic of the Congo | 0 |
46= | National University of Lesotho | Lesotho | 0 |
46= | Universidade Jean Piaget de Cabo Verde | Cape Verde | 0 |
46= | National Engineering School of Sfax | Tunisia | 0 |
46= | UniversitƩ Marien Ngouabi | Republic of the Congo | 0 |
46= | University of Liberia | Liberia | 0 |
46= | UniversitƩ Djillali Liabes | Algeria | 0 |
46= | UniversitƩ Abdou Moumouni de Niamey | Niger | 0 |
46= | Misurata University | Libya | 0 |
46= | UniversitƩ de Dschang | Cameroon | 0 |
46= | UniversitƩ de Bangui | Central African Republic | 0 |
46= | UniversitƩ de Nouakchott | Mauritania | 0 |
46= | Eritrea Institute of Technology | Eritrea | 0 |
46= | UniversitƩ de Djibouti | Djibouti | 0 |
46= | University of Seychelles | Seychelles | 0 |
46= | Mogadishu University | Somalia | 0 |
46= | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 0 |
46= | UniversitƩ Gamal Abdel Nasser de Conakry | Guinea | 0 |
46= | University of Makeni | Sierra Leone | 0 |
46= | John Garang Memorial University | South Sudan | 0 |
46= | Hope African University | Burundi | 0 |
46= | UniversitƩ de Moundou | Chad | 0 |
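For anyone who wants to replicate or update the exercise, a scripted version might look something like the sketch below. To be clear, the counts in the table above were obtained manually through the Patentscope web interface; the URL pattern and the result-count extraction here are assumptions made for illustration, and the real page markup may well differ.

```python
# A rough, hypothetical sketch of automating the Patentscope counts.
# The endpoint parameters and the regex are assumptions, not a documented
# API; check the site's terms of use before running anything like this.
import re
import time
import requests

UNIVERSITIES = ["University of Cape Town", "University of Pretoria"]

def patentscope_hits(name: str) -> int:
    # Assumed search URL: quote the name so it is matched as a phrase
    # across all fields, as in the manual searches described above.
    resp = requests.get(
        "https://patentscope.wipo.int/search/en/result.jsf",
        params={"query": f'"{name}"'},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumed extraction: look for an "N results"-style figure in the page.
    match = re.search(r"([\d,]+)\s+results", resp.text)
    return int(match.group(1).replace(",", "")) if match else 0

for name in UNIVERSITIES:
    print(name, patentscope_hits(name))
    time.sleep(2)  # be polite to the server between queries
```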
The Onion Analyses the US News Rankings
Just an extract. The whole thing is here.
- Step 1: Schools are weighed on a scale
- Step 2: Researchers calculate each campus’ student-to-student ratio
- Step 3: Any college whose colors are maroon and gold is immediately eliminated
Friday, August 07, 2015
Error announcement from CWTS Leiden Ranking
See here for an error announcement from CWTS Leiden Ranking.
The prompt disclosure of the error adds to the credibility of the rankings.