The relentless grade inflation in British secondary and tertiary education has been well documented. A first or upper second class degree or a grade A at A level no longer means very much. It has been far too easy for universities to cover up their deficiencies or attract applications by handing out firsts or upper seconds like smarties.
Now the consequences are beginning to become apparent. Ernst and Young (EY), the accounting firm, will no longer require applicants to have an upper second or three grade Bs. Instead they will use "numerical tests" and "strength assessments" to assess applicants.
I suspect that in a little while EY will come under fire for not recruiting enough of those groups that do not test well, especially in quantitative skills. They and others will probably join in the hunt for the Holy Grail of modern social science, the factor that is non-cognitive but still significantly predictive of career success.
Meanwhile, universities will try to find new functions to replace their historical role as the guarantor of a certain level of cognitive ability. Expect to see more conversations about assessing civic engagement or reaching out to communities.
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, August 04, 2015
Monday, August 03, 2015
The CWUR Rankings 2015
The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced the latest edition of its global ranking of 1,000 universities. The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.
The rankings include five indicators that measure various aspects of publication and research: publications in "reputable journals", research papers in "highly influential" journals, citations, h-index and patents.
These indicators are given a combined weighting of 25%.
Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). This is obviously a crude measure which fails to distinguish among the great mass of universities that have never won an award.
Similarly, another quarter is assigned to Quality of Faculty measured by the number of faculty receiving such awards and another quarter to Alumni Employment measured by the number of CEOs of top corporations. Again, these indicators are of little or no relevance to all but a few hundred institutions.
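The weighting scheme described above amounts to a simple weighted sum of four equally weighted components. A minimal sketch (all names and numbers here are illustrative, not CWUR's actual data) also shows why the award- and CEO-based indicators fail to differentiate most institutions:

```python
# Illustrative weighted composite in the style described above: four
# components, each worth 25% of the overall score. All names and
# numbers are hypothetical, not CWUR's actual data.

WEIGHTS = {
    "research": 0.25,              # five publication/patent indicators combined
    "quality_of_education": 0.25,  # alumni winning major international awards
    "quality_of_faculty": 0.25,    # faculty winning such awards
    "alumni_employment": 0.25,     # alumni serving as CEOs of top companies
}

def composite(scores):
    """Weighted sum of indicator scores, each on a 0-100 scale."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A university with no laureates and no CEOs scores zero on three of the
# four components, so its overall score collapses to research / 4.
typical = {"research": 80.0, "quality_of_education": 0.0,
           "quality_of_faculty": 0.0, "alumni_employment": 0.0}
print(composite(typical))  # 20.0
```

For the great mass of universities the three award- and CEO-based components are identically zero, which is the objection raised above: three quarters of the total weighting does no work in separating them.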
1. Harvard
2. Stanford
3. MIT
4. Cambridge
5. Oxford
6. Columbia
7. Berkeley
8. Chicago
9. Princeton
10. Cornell.
The only change from last year is that Cornell has replaced Yale in tenth place.
Countries with Universities in the Top Hundred in 2015 and 2014
Country | Universities in top 100 2015 | 2014 |
---|---|---|
US | 55 | 53 |
UK | 7 | 7 |
Japan | 7 | 8 |
Switzerland | 4 | 4 |
France | 4 | 4 |
Canada | 3 | 3 |
Israel | 3 | 3 |
South Korea | 2 | 1 |
Germany | 2 | 4 |
Australia | 2 | 2 |
China | 2 | 2 |
Netherlands | 2 | 1 |
Russia | 1 | 1 |
Taiwan | 1 | 1 |
Belgium | 1 | 1 |
Norway | 1 | 0 |
Sweden | 1 | 2 |
Singapore | 1 | 1 |
Denmark | 1 | 1 |
Italy | 0 | 1 |
Top Ranked in Region or Country
USA: Harvard
Canada: Toronto
Asia: Tokyo
South Asia: IIT Delhi
Southeast Asia: National University of Singapore
Europe: Cambridge
Central and Eastern Europe: Lomonosov Moscow State University
Arab World: King Saud University
Middle East: Hebrew University of Jerusalem
Latin America: Sao Paulo
Africa: University of the Witwatersrand
Caribbean: University of Puerto Rico at Mayagüez
Noise Index
In the top 20, the CWUR rankings are more stable than THE and QS but less stable than the Shanghai rankings.
Average position change of universities in the top 20 in 2014: 0.5
Comparison
CWUR 2013-14: 0.9
Shanghai Rankings (ARWU)
2011-12: 0.15
2012-13: 0.25
THE WUR 2012-13: 1.2
QS WUR 2012-13: 1.7
With regard to the top 100, the CWUR rankings are more stable this year, with volatility similar to that of the QS and THE rankings although still considerably greater than that of ARWU.
Average position change of universities in the top 100 in 2014: 4.15
Comparison
CWUR 2013-14: 10.59
Shanghai Rankings (ARWU)
2011-12: 2.01
2012-13: 1.66
THE WUR 2012-13: 5.36
QS WUR 2012-13: 3.97
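The noise index used here, the average position change between two editions of a ranking, can be computed directly from two rank lists. A minimal sketch with made-up ranks (one judgment call is how to treat universities that drop out of the top group; here they are simply skipped):

```python
# Mean absolute change in rank, for universities ranked in the base year,
# between two editions of a ranking. The ranks below are invented for
# illustration, not real CWUR data.

def average_position_change(base_year, next_year):
    """Average |rank change| for base-year universities still ranked next year."""
    changes = [abs(next_year[u] - rank)
               for u, rank in base_year.items() if u in next_year]
    return sum(changes) / len(changes)

ranks_2014 = {"Alpha U": 1, "Beta U": 2, "Gamma U": 3, "Delta U": 4}
ranks_2015 = {"Alpha U": 1, "Beta U": 3, "Gamma U": 2, "Delta U": 4}
print(average_position_change(ranks_2014, ranks_2015))  # 0.5
```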
Saturday, August 01, 2015
The Other THE African University Rankings
THE have just presented Africa and the world with a list of 30 African universities ranked according to "research impact", that is the number of citations per paper normalised by field (300 of them?) and year. Citations are not just counted but compared with the world average for specific years and fields.
The result is that a university that manages to join a large international project, typically in medicine, genetics or particle physics, whose publications attract a disproportionate number of citations, especially in the first couple of years after publication, can get an extremely high score. If the university had few publications to begin with, the score for this indicator would be even higher.
For these rankings THE have introduced fractionalised counting, so that a university that is one of 100 contributors to a project with 2,000 citations would get the equivalent of 20 citations. Under the procedure that THE and their former data collectors Thomson Reuters had been using for the world university rankings, it would have been credited with all 2,000 citations, as would every other contributor.
THE are to be congratulated for finally using fractionalised counting, which has reduced the likelihood of the indicator producing very odd results. Even so, the snapshot ranking is inappropriate for African universities, as it still privileges those that happen to contribute to a few international projects.
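The arithmetic behind the two counting methods in the example above (100 contributing institutions, 2,000 citations) can be sketched in a few lines; this illustrates the principle only, not THE's actual pipeline:

```python
# Citation credit for one institution contributing to a multi-institution paper.

def full_credit(citations, n_institutions):
    # Old method: every contributing institution is credited with
    # all of the paper's citations, regardless of how many took part.
    return float(citations)

def fractional_credit(citations, n_institutions):
    # Fractionalised counting: credit is divided equally among the
    # contributing institutions.
    return citations / n_institutions

print(full_credit(2000, 100))        # 2000.0
print(fractional_credit(2000, 100))  # 20.0
```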
The results might seem acceptable to THE and its international audience, but I suspect that Egyptian academics will be amused by a ranking that includes six Egyptian universities but not Cairo University. I wonder how many Nigerians will accept a ranking that includes Port Harcourt but not Ibadan or Ahmadu Bello.
Along with standardised scores for citations THE has also included the number of publications in the Scopus database between 2009 and 2013. This, a measurement of research output of a fairly high quality, is probably more relevant to Africa than the citations indicator. Unfortunately, it shows the very limited amount of research done between the Sahara and the Kalahari and so would be inexpedient to present as a snapshot of what a future ranking might look like.
The methods, approaches and assumptions of THE's world rankings, with their emphasis on inputs (especially income), research quality (inappropriately called research impact or research influence), reputation and doctoral education, are of limited value to all but a few African universities and stakeholders. Whether anything of value comes from the conversation in Johannesburg remains to be seen, but it is unlikely that a modified version of the world rankings will be of much use to anyone.
Anyway, below are the 30 African universities reordered according to number of publications.
Rank | University | Country | Number of publications, 2009-2013 |
---|---|---|---|
1 | University of Cape Town | South Africa | 5540.21 |
2 | University of Pretoria | South Africa | 4544.33 |
3 | University of the Witwatersrand | South Africa | 4387.17 |
4 | Stellenbosch University | South Africa | 4357.33 |
5 | University of Kwazulu-Natal | South Africa | 4235.09 |
6 | Alexandria University | Egypt | 2550.15 |
7 | Universite de Sfax | Tunisia | 2355.30 |
8 | University of Johannesburg | South Africa | 2192.74 |
9 | North West University | South Africa | 1707.94 |
10 | Assiut University | Egypt | 1588.64 |
11 | University of the Free State | South Africa | 1512.56 |
12 | Université Mohammed V – Agdal | Morocco | 1503.69 |
13 | Rhodes University | South Africa | 1296.96 |
14 | University of the Western Cape | South Africa | 1154.77 |
15 | Makerere University | Uganda | 1112.69 |
16 | Suez Canal University | Egypt | 998.98 |
17 | University of South Africa | South Africa | 981.67 |
18 | Universite Hassan II Casablanca | Morocco | 960.25 |
19 | Universite Cadi Ayyad | Morocco | 910.82 |
20 | Addis Ababa University | Ethiopia | 893.90 |
21 | Nelson Mandela Metropolitan University | South Africa | 885.77 |
22 | Universite de Tunis | Tunisia | 879.63 |
23 | Universite de Yaounde I | Cameroon | 876.33 |
24 | Ecole Nationale d’Ingénieurs de Sfax | Tunisia | 822.31 |
25 | University of Ghana | Ghana | 804.53 |
26 | American University in Cairo | Egypt | 700.89 |
27 | Minia University | Egypt | 694.79 |
28 | University of Nairobi | Kenya | 671.72 |
29 | South Valley University | Egypt | 636.85 |
30 | University of Port Harcourt | Nigeria | 573.55 |
Wednesday, July 29, 2015
Google Scholar Ranking of African Universities
As competition in the ranking world intensifies, Times Higher Education (THE) and Quacquarelli Symonds are diligently promoting their various regional ranking, data processing and event management projects. The latest is THE's African summit at the University of Johannesburg.
Three weeks ago THE issued what they described as an "experimental and preliminary" ranking, which consisted of 15 universities ordered according to the number of citations per paper normalised for field and year. An interesting innovation was that citations were fractionalised, so that participants in large collaborative projects would be credited in proportion to their fraction of the total contributors.
This is just one indicator, and it is not really a measure of research influence but rather of research quality; it is still skewed by participation in multi-contributor papers in medicine and particle physics. It is unlikely that the University of Port Harcourt or Universite Cadi Ayyad would be in the top ten of any other indicator.
THE have indicated that they will add another 15 names to the list at the Johannesburg summit.
The table below was compiled for the purpose of checking on the claims of THE or other rankers that might attempt to evaluate African universities. It simply counts the number of results (2012-2014, excluding citations and patents) from a query to Google Scholar. Data was compiled on the 25th and 26th of July. The criteria for inclusion were a place in the top 50 of the Webometrics rankings or inclusion among the 15 universities in the THE list. The top university in any country not otherwise represented was added from either the Webometrics or 4icu rankings.
This database includes papers, reports, theses and dissertations, conference proceedings and so on. It is certainly not a measure of research quality but rather of the volume of any activities connected with research. In the case of the two Kenyan universities it probably reflects the size and inclusiveness of the university repositories.
One thing about the Google Scholar list is that it confirms suspicions that the quality of Egyptian universities has been underestimated by the big-name rankers. For further evidence one might look at data from social media such as LinkedIn, or just contrast the aspirations of Egyptian students in the revolutions of 2011 and 2013 with those of students at the University of Cape Town and Durban University of Technology.
Rank | University | Country | Google Scholar Results |
---|---|---|---|
1 | University of Cape Town | South Africa | 17,000 |
2 | Cairo University | Egypt | 16,800 |
3 | University of Pretoria | South Africa | 16,500 |
4 | University of Nairobi | Kenya | 16,400 |
5 | University of the Witwatersrand | South Africa | 15,800 |
6 | University of Kwazulu-Natal | South Africa | 15,500 |
7 | Stellenbosch University | South Africa | 14,900 |
8 | University of Ibadan | Nigeria | 14,800 |
9 | University of South Africa | South Africa | 13,500 |
10 | Kenyatta University | Kenya | 12,000 |
11 | University of Johannesburg | South Africa | 11,200 |
12 | Makerere University | Uganda | 10,400 |
13 | North West University | South Africa | 10,100 |
14 | University of Ghana | Ghana | 8,330 |
15 | Alexandria University | Egypt | 7,610 |
16 | University of Lagos | Nigeria | 7,220 |
17 | Rhodes University | South Africa | 7,210 |
18 | University of the Western Cape | South Africa | 6,870 |
19 | Obafemi Awolowo University | Nigeria | 6,800 |
20 | Mansoura University | Egypt | 6,480 |
21 | University of the Free State | South Africa | 6,400 |
22 | Addis Ababa University | Ethiopia | 6,210 |
23 | Zagazig University | Egypt | 6,160 |
24 | American University in Cairo | Egypt | 5,770 |
25 | University of Ilorin | Nigeria | 5,620 |
26 | Assiut University | Egypt | 5,580 |
27 | Kwame Nkrumah University of Science and Technology | Ghana | 5,080 |
28= | University of Zimbabwe | Zimbabwe | 4,830 |
28= | University of Port Harcourt | Nigeria | 4,830 |
30 | University of Botswana | Botswana | 4,260 |
31 | University of Zambia | Zambia | 4,240 |
32 | University of Dar Es Salaam | Tanzania | 4,120 |
33 | University of Khartoum | Sudan | 4,110 |
34 | Suez Canal University | Egypt | 3,670 |
35 | Tanta University | Egypt | 3,600 |
36 | Jomo Kenyatta University of Agriculture and Technology | Kenya | 3,520 |
37 | Nelson Mandela Metropolitan University | South Africa | 3,490 |
38 | Covenant University Ota | Nigeria | 2,950 |
39 | Helwan University | Egypt | 2,940 |
40 | Benha University | Egypt | 2,570 |
41 | Minia University | Egypt | 2,390 |
42 | University of Malawi | Malawi | 2,340 |
43 | Université Abou Bekr Belkaid | Algeria | 2,290 |
44 | University of Tunis | Tunisia | 2,270 |
45= | Université Kasdi Merbah Ouargla | Algeria | 2,240 |
45= | Cape Peninsula University of Technology | South Africa | 2,240 |
47 | Université Cheikh Anta Diop de Dakar | Senegal | 1,950 |
48 | University of Namibia | Namibia | 1,760 |
49 | Universite de la Reunion | Reunion | 1,690 |
50 | Durban University of Technology | South Africa | 1,560 |
51 | University of Mauritius | Mauritius | 1,490 |
52 | Université d'Abomey-Calavi | Benin | 1,460 |
53 | South Valley University | Egypt | 1,440 |
54 | Universidade Eduardo Mondlane | Mozambique | 1,420 |
55 | Beni-Suef University | Egypt | 1,400 |
56 | Université Cadi Ayyad Marrakech | Morocco | 1,370 |
57 | Université de Ouagadougou | Burkina Faso | 1,300 |
58 | University of Rwanda | Rwanda | 1,270 |
59 | Université des Sciences et de la Technologie Houari Boumediene | Algeria | 976 |
60 | Université de Lomé | Togo | 784 |
61 | Université de Bamako | Mali | 660 |
62 | Kafrelsheikh University | Egypt | 618 |
63 | University of Swaziland | Swaziland | 615 |
64 | Université Félix Houphouët-Boigny | Ivory Coast | 590 |
65 | Université de Kinshasa | Democratic Republic of the Congo | 558 |
66 | National University of Lesotho | Lesotho | 555 |
67 | Université Constantine 1 | Algeria | 468 |
68 | Bejaia University | Algeria | 413 |
69 | Universidade Jean Piaget de Cabo Verde | Cape Verde | 407 |
70 | Université Mohammed V Souissi | Morocco | 361 |
71 | National Engineering School of Sfax | Tunisia | 271 |
72 | Université Marien Ngouabi | Republic of the Congo | 256 |
73 | University of Liberia | Liberia | 255 |
74 | Université Djillali Liabes | Algeria | 243 |
75 | Université Abdou Moumouni de Niamey | Niger | 206 |
76 | Misurata University | Libya | 155 |
77 | Université Omar Bongo | Gabon | 138 |
78 | University of The Gambia | Gambia | 130 |
79 | Universidade Católica de Angola | Angola | 115 |
80= | Université de Dschang | Cameroon | 113 |
80= | Université de Bangui | Central African Republic | 113 |
82 | Université de Nouakchott | Mauritania | 108 |
83 | Eritrea Institute of Technology | Eritrea | 76 |
84 | Université de Djibouti | Djibouti | 66 |
85 | Université de Toliara | Madagascar | 59 |
86 | Université Hassan II Ain Chock | Morocco | 55 |
87 | University of Seychelles | Seychelles | 52 |
88 | Mogadishu University | Somalia | 51 |
89 | Universidad Nacional de Guinea Ecuatorial | Equatorial Guinea | 40 |
90 | Universite Gamal Abdel Nasser de Conakry | Guinea | 21 |
91 | University of Makeni | Sierra Leone | 18 |
92 | John Garang Memorial University | South Sudan | 12 |
93 | Hope Africa University | Burundi | 3 |
94 | Universite de Moundou | Chad | 2 |
Tuesday, July 28, 2015
Would anyone notice if a small, old but awfully clever dog filled in a university ranking survey, and would it make a difference?
The Australian newspaper The Age has a piece by Erica Cervini on how she allowed her dog to complete the QS academic reputation survey on the quality of veterinary schools.
She doesn't elaborate on how the dog chose the schools. Was it by barking or tail wagging when shown pictures of the buildings?
Seriously though, she does have a point. Can QS stop people signing up just to support their employer or outvote their rivals?
To be fair, QS are aware that their surveys might be manipulated and have taken steps over the years to prevent this, by such means as forbidding respondents from voting for their declared employer or repeat voting from the same computer. Even so, it seems that some universities, especially in Latin America, are getting scores in the reputation surveys that appear too high, especially when compared with their overall scores. In the employer survey the Pontifical Catholic University of Chile is 56th (overall 167) and the University of Buenos Aires 49th (overall 198). In the academic survey the University of Sao Paulo is 54th (overall 132) and the National Autonomous University of Mexico 55th (overall 175).
QS are apparently considering reforming their citations per faculty indicator and allowing unchanged responses to the surveys to be recycled for five instead of three years. This is welcome but a more rigorous overhaul of the reputation indicators is sorely needed.
Thursday, July 23, 2015
Perfect Storm Heading for Tokyo Metropolitan University
Seen on the Times Higher Education website today:
Tokyo Metropolitan University
World's Best University
Scored a Perfect 100.00 for Two Years in Citations Sector
From TMU to the World
Tokyo Metropolitan University got its perfect score largely because it was one of hundreds of institutions contributing to a few publications from the Large Hadron Collider project. In their recent experimental African rankings THE started using fractionalized counting of citations. If THE use this method in the coming world rankings then TMU will surely suffer a dramatic fall in the citations indicator.
I would not like to be the president of TMU on September 30th.
Wednesday, July 22, 2015
Recommended Reading
Anybody interested in educational policy, especially the never-ending campaign to close gaps of one sort or another, or in the oddities of university rankings, should take a look at chapter four of Jordan Ellenberg's How Not to Be Wrong: The Power of Mathematical Thinking, which is about the obvious -- or what ought to be obvious -- observation that smaller populations are more variable.
He notes that South Dakota is top of the league for brain cancer while North Dakota is near the bottom. What makes the difference? It is just that the bigger the population the more likely it is that outliers will be diluted by a great mass of mediocrity. So, extreme scores tend to crop up in small places or small samples.
Similarly, when he tossed coins ten at a time he came up with head counts ranging from 3 to 9 out of ten.
When he tossed them 100 at a time he got counts ranging from 45 to 60.
When he (actually his computer program) tossed them 1,000 at a time, the counts ranged from 462 to 537.
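Ellenberg's coin-tossing demonstration is easy to reproduce. A quick simulation (exact ranges will vary from run to run) shows the proportion of heads clustering ever more tightly around one half as the number of tosses per trial grows:

```python
# The spread of head counts, as a proportion of tosses, narrows as the
# number of tosses per trial grows -- small samples produce the outliers.
import random

def head_count_range(tosses_per_trial, trials=100):
    """Min and max number of heads across repeated trials of fair coin tosses."""
    counts = [sum(random.random() < 0.5 for _ in range(tosses_per_trial))
              for _ in range(trials)]
    return min(counts), max(counts)

for n in (10, 100, 1000):
    low, high = head_count_range(n)
    print(f"{n} tosses per trial: heads ranged from {low} to {high}")
```

Expressed as proportions, the ranges shrink from something like 0.3-0.9 at ten tosses towards 0.46-0.54 at a thousand, which is exactly why extreme scores turn up in small places and small samples.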
It is worth remembering this when a study with a double-digit sample is published showing the latest way to close one of the achievement gaps, or a very small school in a rural state somewhere starts boosting the test scores of underperforming students, or a few test takers reveal that the national IQ is imploding. Or when the studies fail to be replicated, if indeed anyone tries.
Or university rankings that show very small or very unproductive institutions having an enormous research impact measured by citations.
Saturday, July 18, 2015
In the QS BRICS rankings nearly everybody gets a prize
There is a growing trend towards specialised and regional university rankings. The magic of this is that they can provide something for almost everybody. QS recently published its latest BRICS rankings which combined data from five very different university systems. The result was a triumphant (almost) success for everybody (almost).
Here are some highlights that QS could use in selling the BRICS rankings or an expanded version.
Russian universities are ahead of everybody else for teaching quality.
The top 21 universities in the BRICS for Faculty Student Ratio (perhaps not a perfect proxy for teaching excellence) are Russian, headed by Bauman Moscow State Technical University. Imagine what Russian universities could do if QS recognised the importance of teaching and increased the weighting for this indicator.
India performs excellently for Faculty with a Ph D.
Out of the top 15 for this category, ten are Indian and all of these get the maximum score of 100. Of the other five, four are Brazilian and one Chinese. If only QS realised the importance of a highly qualified faculty, India would do much better in the overall rankings.
South Africa takes five out of the first six places for international faculty.
China has four out of five top places for academic reputation and employer reputation.
Meanwhile a Brazilian university is first for international faculty and another is third for academic reputation.
It seems that with rankings like these a lot depends on the weighting assigned to the various indicators.
Yes, going to the library might be good for you
A study by Felly Chiteng Kot and Jennifer L. Jones has found that
"using a given library resource was associated with a small, but also meaningful, gain in first-term grade point average, net of other factors."
But, correlation does not necessarily mean causation. Could it be that bright people like to go to libraries?
Still, most students are likely to behave better in the library than other places, so let's not quibble too much.
Tuesday, July 14, 2015
Implications of the THE African Pilot Ranking
The most interesting thing about THE's experimental African ranking is the use of fractionalised counting of citations. This means that the total number of citations is divided by the number of institutions contributing to a publication. Previously, the method used in THE rankings was to assign a paper's total citations to every institution that contributed, just as though each one had been the sole contributor. This has produced some very questionable results, with universities that were excellent but very specialised, or just generally unproductive, receiving remarkably high scores for citations. Panjab University, Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Federico Santa Maria Technical University and Moscow State Engineering Physics Institute have all had moments of glory in the THE rankings because of citation scores dramatically higher than their scores for research or any other indicator or group of indicators.
The new method, if applied generally, is likely to produce a significant reduction in the scores of such universities. We can estimate what might happen by looking at the four universities that are included in both the African pilot ranking and last year's world rankings: Cape Town, Witwatersrand, Stellenbosch and Université Cadi Ayyad of Marrakesh, Morocco.
In the world rankings these universities received citation scores of 86.6, 67.3, 45.6 and 83 respectively. The score of 83 for Université Cadi Ayyad resulted very largely from its contributions to several publications from the Large Hadron Collider project (one of which has been cited over 2,000 times), from a low overall output of papers, and from the "regional modification" that gave a big boost to low-scoring countries. The scores for the three South African universities reflected a larger total output and citations over a broad range of disciplines.
In the African pilot ranking the scores for citations were Cape Town 90.90, Witwatersrand 99.76, Stellenbosch 95.48 and Cadi Ayyad 78.61. The high scores for the South African institutions reflect a much lower mean score than in the world rankings.
The fall in Cadi Ayyad's citation score from 3.6 points below Cape Town to 21.3 below and its falling behind Stellenbosch and Witwatersrand presumably reflect the impact of fractionalised counting.
This suggests that if fractionalised counting is used in the coming World University Rankings many small or specialised institutions will suffer and there will be a lot of reshuffling.
Thursday, July 09, 2015
The Top University in Africa is ...
... the University of Cape Town. What a surprise!
Times Higher Education (THE) has produced another "snapshot" ranking. This one is a list of 15 African universities ranked according to "research influence", that is the number of citations per paper normalised by field and year. It seems that a larger list will be published at a THE summit at the University of Johannesburg scheduled for the end of this month. Then, apparently, there will be discussions about full rankings with a broad range of indicators.
This is a smart move. Apart from diluting the impact of the QS BRICS rankings, this table puts the summit host in the top ten and gets attention from around the continent with three places in the north, two in the west and two in the east in the top fifteen.
Here are the top 15:
1. University of Cape Town, South Africa
2. University of the Witwatersrand, South Africa
3. Makerere University, Uganda
4. Stellenbosch University, South Africa
5. University of KwaZulu-Natal, South Africa
6. University of Port Harcourt, Nigeria
7. University of the Western Cape, South Africa
8. University of Nairobi, Kenya
9. University of Johannesburg, South Africa
10. Universite Cadi Ayyad, Morocco
11. University of Pretoria, South Africa
12. University of Ghana
13. University of South Africa
14. Suez Canal University, Egypt
15. Universite Hassan II, Morocco.
This is, of course, just one indicator but even so there will be a few academic eyebrows rising around the continent. Makerere has a good national and regional reputation but does it have more research influence than all but two South African universities?
How come Suez Canal University is there but not Cairo University or the American University in Cairo? And I am sure that in Nigeria there will be a lot of smirking around Ahmadu Bello and Ibadan Universities about Port Harcourt in sixth place.
One very good thing about this "experimental and preliminary ranking" is that THE and data provider Scopus are now using fractionalised counting of citations, so that if 100 universities contribute to a publication they each get credit for one hundredth of the citations.
That has not stopped Makerere and Port Harcourt from getting a boost, perhaps too much of a boost, for taking part in a huge multinational medical study but it has reduced the distortions that this indicator can cause.
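The difference between the old and the new counting methods is easy to show. The sketch below uses the post's own example ratio, a paper with 100 contributing institutions, plus a hypothetical citation count; it is not THE's or Scopus's actual code.

```python
# Full versus fractionalised counting of citations for a
# multi-institution paper. Figures are invented for illustration.

def full_credit(citations, n_institutions):
    # The older approach: every contributor gets all the citations.
    return [citations] * n_institutions

def fractional_credit(citations, n_institutions):
    # Fractionalised counting: the citations are split equally,
    # so 100 contributors each get one hundredth of the credit.
    return [citations / n_institutions] * n_institutions

# A hypothetical LHC-style paper: 2,000 citations, 100 institutions.
full = full_credit(2000, 100)
frac = fractional_credit(2000, 100)
print(full[0])   # 2000 citations credited to each institution
print(frac[0])   # 20.0 citations credited to each institution
```

For a small institution with a low overall output, the difference between being credited with 2,000 citations and being credited with 20 is exactly the distortion discussed in this post.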
So, for once, well done THE!... Now, what about taking a look at secondary affiliations?
Saturday, July 04, 2015
Is this really happening?
If this continues, France will be the least intelligent country in the world within a century.
Drawing straight lines on graphs and getting excited about tiny samples can be dangerous. Even so, this is a little frightening.
James Thompson's blog notes a study by Edward Dutton and Richard Lynn that suggests that the French national average IQ declined by nearly 4 points in a decade. The sample size was only 79 so we should not start panicking too much until there are a few more studies. It will be interesting to see if this one is replicated.
Thursday, July 02, 2015
Now the British Academy Sees a Problem
Yesterday I referred to the poor numeracy skills of British (England and Northern Ireland) tertiary graduates reported by the OECD.
Now the British Academy has had its say. It reports that the performance of British school pupils is mediocre and that many undergraduate students are weak in statistics.
But it looks like middling (compared to the OECD average) secondary school students become almost rock bottom (in the OECD) tertiary graduates. Could it be that British universities are actually subtracting relative value from their students?
The Academy notes:
"Our school pupils tend to be ranked only in the middle of developed nations in mathematics. Our undergraduates embark on degree courses with varying, and often weak, fluency in statistics. And, in the workplace, demand for more advanced quantitative skills has risen sharply in the past two decades."
Perhaps this has something to do with relatively high graduation rates at British universities, so that mediocre students with weak numeracy skills are recorded as tertiary graduates while their counterparts in most of the OECD drop out and remain classified as secondary graduates. Even if that were the case, the mediocrity of secondary students and tertiary graduates would still need to be addressed.
The Academy proposes a strategy that includes improving the quality of quantitative skills teaching, reviewing school curricula and addressing the early dropping of maths by secondary school students.
I suspect that that will be insufficient.
Wednesday, July 01, 2015
Today India, Tomorrow Japan, Then ....
The ranking businesses are extending their global tentacles. Times Higher Education has produced a "snapshot" MENA ranking with some interesting results -- Texas A&M University Qatar top for research impact -- and will be announcing their world rankings from Melbourne.
Meanwhile, QS will be in India next week to reveal their latest BRICS rankings and has been getting attention in new places for its subject rankings that get to places other rankings won't go.
With QS getting more international, it is no surprise to hear that Mitsui & Co, Ltd, has purchased shares in QS:
'Nunzio Quacquarelli, CEO of QS said the investment from Mutsui “can especially support our development in Asia” adding, “we were seeking and have found a likeminded company which shares our long term vision” '.This is not the first sign of Mitsui's interest in tertiary education:
'Last year the company also invested $5m in Synergis Education, an education company specialising in online and on the ground adult learner programmes.
“We aim to use our experience in the online education field to create new services,” said Takeshi Akutsu, GM of Mitsui Service Business Division in a statement.
“At the same time, through this business we will help to nurture the global human resources needed by the global economy.” '
I wonder if QS will try to start ranking online courses next.
Tuesday, June 30, 2015
What's the Problem with U-Multirank and AHELO?
In a recent post, I discussed the contrast between the poor skills of young people in the UK (strictly speaking Northern Ireland and England) and the high regard in which British universities are held by the brand name rankers.
There is a piece of data in the skills report from the OECD that is interesting in this respect. Figure 2.2 shows the average numeracy skills of new graduates (age 16-29, 2012). It is depressing reading. The data for tertiary graduates shows that only Italy does worse than the UK, and Ireland is either the same or almost the same. The US is very slightly ahead. The top scorers are Austria, Flanders and the Czech Republic.
Something that should have everybody running around doing research and forming committees is that British tertiary graduates are only very slightly better than most European secondary graduates and slightly better than South Koreans with less than an upper secondary qualification.
It is possible, indeed quite probable, that British tertiary graduates do better on verbal skills and likely that they could conduct themselves well in interviews. Perhaps also, it is places like the University of East London and Bolton University that are dragging down the British average. But this dramatically poor performance is such a glaring contrast to the preening self satisfaction of the higher education establishment that some discussion at least is called for.
We may be seeing an explanation for the reluctance of the Russell Group and its orbiters and the Ivy League to cooperate with U-Multirank and their disdain for the AHELO project that is in marked contrast with their support for the trusted and prestigious THE rankings. They are quite happy to be assessed on reputation, resources, income and citations but comparison with the cognitive skills of graduates from the upstarts of East Asia and perhaps Eastern and Central Europe is something to be avoided.
Saturday, June 27, 2015
Why Russia Might Rise Fairly Quickly in the Rankings After Falling a Bit
An article by Alex Usher in Higher Education in Russia and Beyond, reprinted in University World News, suggests five structural reasons why Russian universities will not rise very quickly in the global rankings. These are:
- the concentration of resources in academies rather than universities
- excessive specialisation among existing universities
- a shortage of researchers caused by the economic crisis of the nineties
- excessive bureaucratic control over research projects
- limited fluency in English.
Over the next couple of years things might even get a bit worse. QS are considering introducing a sensible form of field normalisation, just for the five main subject groups. This might not happen since they are well aware of the further advantages this will give to English speaking universities, especially Oxbridge and places like Yale and Princeton, that are strong in the humanities and social sciences. But if it did it would not be good for Russian universities. Meanwhile, THE has spoken about doing something about hugely cited multi-authored physics papers and that could drastically affect institutions like MEPhI.
But after that, there are special features in the QS and THE world rankings that could be exploited by Russian universities.
Russia is surrounded by former Soviet countries where Russian is widely used and which could provide large numbers of international research collaborators, an indicator in the THE rankings, and could be a source of international students and faculty, indicators in the THE and QS rankings and a source of respondents to the THE and QS academic surveys.
Russia might also consider tapping the Chinese supply of bright students for STEM subjects. It is likely that the red bourgeoisie will start wondering about the wisdom of sending their heirs to universities that give academic credit for things like walking around with a mattress or not shaving armpit hair and think about a degree in engineering from Moscow State or MEPhI.
Russian universities also appear to have a strong bias towards applied sciences and vocational training that should, if marketed properly, produce high scores in the QS employer survey and the THE Industry Income: Innovation indicator.
Friday, June 26, 2015
Italy and France Accept Gaokao Scores
What will happen when universities find that gaokao is a better predictor of academic ability than A levels or the SAT?
This is from YIBADA.
"Up to 1,000 universities in France, Italy, and other 14 popular overseas destinations for Chinese applicants are now accepting national college entrance test scores or "gaokao" scores as admission criteria, according to a report published on Monday by MyOffer, a London-based online student placement portal.
The findings reflect the growing international recognition for China's national college entrance tests despite lagging behind other exams.
MyOffer, which helps international students with university placements, overseas internships and career development, released the study as this year's "gaokao" scores were announced in several parts of China.
Earlier reports claimed that "gaokao" test results were accepted in 20 countries and regions, but MyOffer's study has by far the most detailed findings available."
Sunday, June 21, 2015
Today, Cuba and China, Tomorrow North Korea?
Another sign of the growing desperation of American colleges to find international students to take the courses American students just won't take is that four Cuban students will take the TOEFL in Havana a week from now. There are plans for the GRE to be offered in Cuba in October.
Saturday, June 20, 2015
The Implications of the University of San Francisco Accepting Gaokao scores.
The University of San Francisco has announced that it will admit a limited number of students on the basis of their scores on the Gaokao, the rigorous Chinese national university entrance exam, plus an interview and English language test in Beijing. The candidates will be spared the necessity of taking TOEFL prep courses and flying to Hong Kong or Singapore for the SAT test.
American and British universities are running out of students capable of taking tertiary education courses. Average cognitive skills of local students are stagnant or declining, which explains the obsession of universities with finding students from overseas to bring in revenue and balance the books. China appears to have a large number of students capable of high achievement in numeracy-based fields.
What would happen if American universities found that Gaokao scores were more predictive of academic success than a dumbed down SAT? What if the English language component turned out to be just as good a measure of language proficiency as IELTS or TOEFL? The consequence might be that the Gaokao could become the normal route for admission to universities outside China.
And looking ahead several decades, what would happen if the Gaokao was offered in languages other than Chinese with test centres being set up outside China?
Thursday, June 18, 2015
Which is the Real Fraud?
The Australian, via Inside Higher Ed, has an article by Kylar Loussikian about a shadowy organisation apparently based in Colchester, England, that supplies ghostwritten academic essays. Australian universities, and maybe others, are getting very concerned about the racket.
'The most common issue, ghostwritten essays, represents a “wicked problem,” said John Shields, deputy dean of the University of Sydney’s business school. “It’s deep and embedded and it’s hard to catch and kill,” he said. “In one sense, ghostwriting has emerged as an area of key concern in academic honesty because many universities are using a first-line defense in terms of [text matching software], and the simple plagiarism approach being detectable has forced those who, for whatever reason, choose to engage in dishonest conduct, to go one level deeper.” '
No doubt there will be a lot of finger pointing and tongue wagging. But are companies like these the real frauds? When millions of students are unable to do the work in courses for which they have been selected shouldn't we conclude that the entire admission process is flawed?
Why are people capable of turning out essays and papers at a few hours' or days' notice not employed in universities? Doesn't this suggest that there is a problem with the recruitment process?
Meanwhile the ghostwriting virus seems to be spreading to graduate and faculty research. In the last few weeks I have received messages from Gulf Dissertation Online, which has "expertly helped and consulted PhD Professors, Lecturers and Scholars with their Thesis, Dissertations and Research Papers for over 12 Years", and Publish Pedia, which "is now offering a unique opportunity to Scholars and Professors who are pursuing their first publication ISI indexed journal or due to insufficient time not able to follow up on their new papers for publication to high impact factor top tier journals keeping the mandatory guidelines for ISI journal approved by the University".
Tuesday, June 16, 2015
The British Paradox Again
We have been told many times before that British universities are punching above their weight and are outperforming their international counterparts. Year after year they do extremely well in the QS and THE world rankings although perhaps not as well in the Shanghai ARWU.
This excellent performance is in glaring contrast to the well documented decline in the cognitive skills of young people in the United Kingdom. A recent publication from the OECD on youth, skills and employability shows that the proportion of 16-29 year olds in the UK (actually England and Northern Ireland) with low literacy skills was well above the OECD average and slightly above the United States. Only Spain and Italy did worse. Not unexpectedly, the top performers here were Japan, Korea and Finland.
What is even more frightening is that the UK is very distinctive in that the proportion of 16-29 year olds with poor literacy skills is higher than that of 30-54 year olds. In every other country except Japan, where literacy is very high among both groups, and Norway, literacy has risen among the younger generation.
For numeracy skills of 16-29 year olds, the UK is again well below the OECD average. The share of young people with limited numeracy is higher than in any other country except Italy and the US. Again there is a decline from the 30-54 year olds.
The OECD has also published data on problem solving abilities in technology-rich environments. This time the UK, like every country assessed, has improved over time but is still behind everyone else except the US, Ireland and Poland.
So how can British students be so bad at literacy, numeracy and problem solving when the universities are, according to international rankers, so brilliant?
Some suggestions.
Perhaps, the rankings are biased towards British universities.
Perhaps, British higher education is highly differentiated with a few outstanding institutions that get high scores in the global league tables and a mass of others that cannot even squeeze into the 400s or 500s or do not even try.
Perhaps, it is just a question of time and in the next few years British universities will collapse under the weight of thousands of students with low cognitive skills who must be admitted to keep revenues flowing.
Monday, June 08, 2015
Why is Bogazici University considered so great in Turkey although it actually is at 400th position in the QS world rankings?
Another question from Quora.
The answer is that the QS rankings favour universities with an established reputation in those countries that are interested in rankings, those that have extensive international linkages, those with a lot of faculty and those with strengths in medical research.
In contrast, the Times Higher Education (THE) rankings favour those powered by hadron driven citations and with the good fortune to be located in countries where most universities produce few citations.
What will happen if THE does reform its citations indicator?
Is The QS Computer Science Ranking Accurate?
Ben Zhao, Professor at UC Santa Barbara, doesn't think so.
"There's a bunch of rankings, US News, Shanghai, US National Research Council, QS. Of all of these, I would probably say that QS is one of the least useful. Why do I say that? I get SPAMMED on multiple email addresses to respond to a survey on QS university rankings. I don't respond, and they just send more mail. This is NOT the behavior of a reputable organization trying to gather a legitimate view of universities and their research quality. ... "
I wouldn't disagree with him about the QS subject rankings, which outside the ranks of the world elite are based on very small samples of employers and academics and small numbers of citations. But it might be unfair to complain about being spammed all the time. This is probably happening because many universities are submitting his name to QS for the academic opinion survey.
As Oscar Wilde probably would have said the only thing worse than being spammed is not being spammed.
Wednesday, June 03, 2015
What do Indian Scientists do on Their Holidays?
The Indian Express has an interesting interview with the Vice-Chancellor of Panjab University, which Times Higher Education (THE), but nobody else, considers to be the best or second best university in India, a feat achieved by an outstanding score for citations.
Here is an extract:
"Did the four-year period, 2010-2014, counted for the Times ranking include old research papers as well?
Yes. It is not about papers that came out in this period but also the papers in which PU figures and which have a high citation. It is a mix of so many things. God particle came up in 2012. So, all those papers are being cited multiple times. Every theorist is cited. So, PU was already doing well, and discovery of God particle made it even better. When there was a lull and Fermilab was closed down for a while, and they were re-building CERN, PU and TIFR went on and joined the groups in B-factory in Japan.
The thing is that you have a job in the university, you have a job for life, you can decide to sleep, still you will get the salary. These professors at PU, or those at IIT-Guwahati, TIFR people, they are conscious that their productivity should not suffer. They should continuously be valued as a member of these collaborations. So, they keep working. So, when there is a holiday, when [other] people spend time here and there, what do High energy physicists do? Class khatam hoti hai [the class finishes]. The next day they take a flight, and go to CERN or Chicago, and there they work hard. You are actually trying to make up for the time you could not do anything because you were doing teaching. That is how international faculty values them also, and they are continuously being included."
So, the Vice-Chancellor is aware that it is the CERN project that is the cause of PU's ranking success. It will be interesting to see what happens if THE does bite the unpleasant tasting bullet and introduce fractionated counting of citations.
But if PU and other Indian institutions continue to improve, even if there is a (temporary?) dip in the THE rankings, then the key to that success may be here. Indian scientists can draw a salary while sleeping if they want but they can also go to Switzerland and discover the fundamental particles of the universe if so inclined. Increasingly, western scientists are apparently expected to spend their days and nights filling out forms, applying for grants, writing teaching philosophies, attending sexual harassment seminars, making safe spaces all over the place, undergoing diversity sensitivity training and so on and so on.
Friday, May 29, 2015
University Ranking Challenge: Your starter for 5,154
Phil Baty, editor of the Times Higher Education World University Rankings, has indicated that the publication of a paper from the ATLAS and CMS experiments at the CERN Large Hadron Collider project is a challenge for rankers.
The paper in question has a total of 5,154 authors, if that is the right word, with sole or primary affiliation to 344 institutions. Of those authors 104 have a secondary affiliation. One is deceased. Under THE's current methodology every institution contributing to the paper will get credit for all the citations that the paper will receive, which is very likely to run into the thousands.
For the elite universities participating in these projects a few thousand citations will make little or no difference. But for a small specialised institution or a large one that does little research, those citations spread out over a few hundred papers could make a big difference.
In last year's rankings places like Florida Institute of Technology, Universite Cadi Ayyad in Marrakesh, Morocco, Federico Santa Maria Technical University, Chile, and Bogazici University, Turkey, got implausibly high scores for citations that were well ahead of their scores for the other criteria.
The paper in question does set a record for the number of contributors although the challenge is not particularly new.
At a seminar in Moscow earlier this year, Baty suggested that THE, now independent of Thomson Reuters, was considering using fractionated counting, dividing all the citations among the contributing institutions.
This would be an excellent idea and should be technically quite feasible since CWTS at Leiden University use it as their default option.
But there would be a price to pay. The current methodology allows THE to boast that it has found a way of uncovering hitherto unnoticed pockets of excellence. It is also a selling point in THE's imperial designs of expanding into regions where there has so far been little interest in ranking: Russia, the Middle East, Africa, the BRICS. A few universities in those regions could make a splash in the rankings if they recruited, even as an adjunct, a researcher working on the LHC project.
It would be most welcome if THE does start using fractionated counting in its citations indicator. Also welcome would be some other changes: not counting self-citations, reducing the weighting for the indicator, including several different methods of evaluating research impact or quality, and, especially important, getting rid of the "regional modification" that awards a bonus for being located in a low scoring country.
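The "regional modification" is reported to work by dividing a university's citation impact score by the square root of its country's average citation impact. A minimal sketch, with invented figures, shows why this favours universities in low-scoring countries:

```python
# Sketch of the reported "regional modification": divide a university's
# citation impact by the square root of its country's average impact.
# The figures below are hypothetical, purely for illustration.
import math

def regional_modification(university_impact, country_avg_impact):
    return university_impact / math.sqrt(country_avg_impact)

# Two universities with identical raw impact, in different countries:
print(regional_modification(1.0, 1.0))   # 1.0 -- average-scoring country, no change
print(regional_modification(1.0, 0.25))  # 2.0 -- low-scoring country doubles the score
```

In other words, identical research performance can produce very different scores depending on how badly a university's compatriots do, which is precisely why the modification deserves scrutiny.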