The latest THE world rankings have just been announced. For most of the indicators there are few surprises. There are more universities from Japan in the rankings. Oxford is first, followed by Cambridge. The USA contributes the largest number of top universities. China rises steadily. India, as usual, is a disappointment.
But, as in previous years, the most interesting thing is the citations indicator, which is supposed to measure research influence. Once again it has produced some remarkable results.
Here are some of the universities in the top 100.
Babol Noshirvani University of Technology: the most influential university in the world for research
Brighton and Sussex Medical School: most influential in Europe
Brandeis University: most influential in the USA
Reykjavik University
St George's, University of London: fallen a bit, probably because of Brexit
King Abdulaziz University: top university for research influence in the Middle East and Asia
Anglia Ruskin University
Jordan University of Science and Technology
Vita-Salute San Raffaele University
Ulsan National Institute of Science and Technology: top in Asia ex Middle East
University of Canberra: best in Australia
Universidad del Desarrollo: best in Latin America
McMaster University: best in Canada
Université de Versailles Saint-Quentin-en-Yvelines: best in France
Teikyo University: best in Japan
There are signs that THE are considering reforming this indicator. If that does happen, the rankings will be more valid but much less entertaining.
Sunday, September 30, 2018
Rankings and Higher Education Policy
Two examples of how the need to perform well in the rankings is shaping national research and higher education policy.
From the Irish Examiner
"Ireland must apply for membership of the world-renowned European Organisation for Nuclear Research (Cern) in order to combat the effect of Brexit and boost university rankings.
That is according to Cork senator Colm Burke as the campaign to join Cern gains momentum, after Ireland recently became a member of the European Space Observatory."
From Times Higher Education
"France’s programme of university mergers is paying off, improving the research performance and international visibility of its top providers, according to the Times Higher Education World University Rankings 2019.
Paris Sciences et Lettres – PSL Research University Paris, a 2010 merger of numerous institutions, climbed 31 places to 41st this year, becoming the first French university to feature in the top 50 best universities since 2011. PSL made its debut in the global table last year.
Its teaching and research scores improved, driven by increased global visibility and votes in the academic teaching and research reputation surveys.
Meanwhile, Sorbonne University, which was founded in January this year following the merger of Pierre and Marie Curie University and Paris-Sorbonne University, has joined the list at 73rd place – making it the highest-ranked newcomer in the table."
https://www.irishexaminer.com/breakingnews/business/cern-membership-vital-for-irish-universities-872312.html
Thursday, September 20, 2018
Philosophy Department Will Ignore GRE Scores
The philosophy department at the University of Pennsylvania has taken a step away from fairness and objectivity in university admissions. It will no longer look at the GRE scores of applicants to its graduate programme.
The department is good but not great. It is ranked 27th in the Leiter Report rankings and in the 101-150 band in the QS world subject rankings.
So how will students be selected without GRE scores? It seems it will be by letters of recommendation, undergraduate GPA, writing samples, and admission statements.
Letters of recommendation have very little validity. The value of undergraduate grades has eroded in recent years and very likely will continue to do so. Admission essays and diversity statements say little about academic ability and a lot about political conformism.
The reasons for the move are not convincing. Paying for the GRE is supposed to be a burden on low income students. But the cost is much less than Penn's exorbitant tuition fees. It is also claimed that the GRE and other standardised tests do not predict performance in graduate school. In fact they are a reasonably good predictor of academic success although they should not be used by themselves.
Then there is the claim that the GRE "sometimes" underpredicts the performance of minorities and women. No doubt it sometimes does but then presumably sometimes it does not. Unless there is evidence that the underprediction is significant and that it is greater than that of other indicators this claim is meaningless.
What will be the result of this? The department will be able to admit students who "do not test well" but who can get good grades, something that is becoming less difficult at US colleges, or persuade letter writers at reputable schools that they will do well.
It is likely that more departments across the US will follow Penn's lead. American graduate programmes will slowly become less rigorous and less able to compete with the rising universities of Asia.
Sunday, September 09, 2018
Ranking Global Rankings: Information
Another indicator for ranking global rankings might be the amount of information that they contain. Here are 17 global rankings in the IREG Inventory ranked according to the number of indicators or groups of indicators for which scores or ranks are given. The median and the mode are both six.
The number for U-Multirank is perhaps misleading since data is not provided for all universities.
| Rank | Ranking | Country of publisher | Number of indicators |
|------|---------|-----------------------|----------------------|
| 1 | U-Multirank | Germany | 112 |
| 2 | | Russia | 20 |
| 3 | CWTS Leiden Ranking | Netherlands | 19 |
| 4 | | USA | 13 |
| 5 | | Taiwan | 8 |
| 6 | CWUR University Rankings | UAE | 7 |
| 7= | | UK | 6 |
| 7= | Shanghai Ranking ARWU | China | 6 |
| 7= | UI GreenMetric Ranking | Indonesia | 6 |
| 7= | URAP University Ranking by Academic Performance | Turkey | 6 |
| 11 | | UK | 5 |
| 12 | Ranking Web of Universities (Webometrics) | Spain | 4 |
| 13 | SCImago Institutions Rankings | Spain | 3 |
| 14 | | UK | 2 |
| 15= | | France | 1 |
| 15= | Reuters Top 100 Innovative Universities | USA | 1 |
| 15= | uniRank | Australia | 1 |
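As a quick check on the median and mode quoted above, here is a minimal sketch (the list is simply the indicator counts from the table; the variable name is mine):

```python
import statistics

# Indicator counts for the 17 rankings, taken from the table above
counts = [112, 20, 19, 13, 8, 7, 6, 6, 6, 6, 5, 4, 3, 2, 1, 1, 1]

print(statistics.median(counts))  # 6
print(statistics.mode(counts))    # 6 (occurs four times)
```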
Monday, September 03, 2018
Ranking Global Rankings: Inclusion
The number of global university rankings continues to grow and it is becoming harder to keep track of them. Earlier this year IREG published an inventory of international rankings that included 17 global rankings. Here are those rankings in order of the number of institutions that they rank in the most recent edition.
Webometrics is the clear winner, followed by uniRank and SCImago. There are, of course, other indicators to think about and some of these will be covered later.
Number of institutions ranked

| Rank | Ranking | Country of publisher | Number ranked |
|------|---------|-----------------------|---------------|
| 1 | Ranking Web of Universities (Webometrics) | Spain | 28,077 |
| 2 | uniRank | Australia | 13,146 |
| 3 | SCImago Institutions Rankings | Spain | 5,637 |
| 4 | URAP University Ranking by Academic Performance | Turkey | 2,500 |
| 5 | U-Multirank | Germany | 1,500 |
| 6 | | USA | 1,250 |
| 7 | THE World University Rankings | UK | 1,000+ |
| 8= | Shanghai Ranking ARWU | China | 1,000 |
| 8= | CWUR University Rankings | UAE | 1,000 |
| 10 | QS World University Rankings | UK | 916 |
| 11 | CWTS Leiden Ranking | Netherlands | 903 |
| 12 | | Taiwan | 800 |
| 13 | | Russia | 783 |
| 14 | UI GreenMetric Ranking | Indonesia | 619 |
| 15 | | UK | 500 |
| 16 | | France | 150 |
| 17 | Reuters Top 100 Innovative Universities | USA | 100 |
Sunday, September 02, 2018
Ranking US Rankings
Forbes Magazine has an article by Willard Dix that ranks US ranking sites. The ranking is informal, with no specified indicators, but the author does give us an idea of what he thinks a good ranking should do.
Here are the top five of thirteen:
1. US News: America's Best Colleges
2. Money magazine: Best Colleges Ranking
3. Forbes: America's Top Colleges
4. Kiplinger's Best College Values
5. Washington Monthly: College Guide and Rankings.
Reading through the comments it is possible to get an idea of the criteria of a good ranking: it should contain a lot of information; it should be comprehensive and include a large number of institutions; it should provide data that helps prospective students and stakeholders; it should have been published for several years; if it uses surveys, they should have a lot of respondents; and it should have face validity (a list with a "revolutionary algorithm" that puts non-Ivy places at the top is in 13th place).
Friday, August 24, 2018
Why is Australia doing well in the Shanghai rankings?
I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers."
The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.
So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities went up a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.
The University of Queensland was unchanged in 55th place while Monash fell from 78th to 91st and the University of Western Australia from 91st to 93rd.
How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit and papers in Nature and Science down a bit.
What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a difference of 8.1 or, after the indicator's 20 per cent weighting, 1.62 points of the overall score. The difference in total scores between 2017 and 2018 was 1.9, so those highly cited researchers made up most of the difference.
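The arithmetic is easy to verify. A minimal sketch, assuming ARWU's published 20 per cent weighting for the highly cited researchers indicator (the variable names are mine):

```python
# ANU's highly cited researchers (HiCi) indicator scores, 2017 and 2018
hici_2017, hici_2018 = 15.4, 23.5
HICI_WEIGHT = 0.20  # ARWU's weighting for the HiCi indicator

weighted_gain = (hici_2018 - hici_2017) * HICI_WEIGHT
total_gain = 1.9  # change in ANU's overall ARWU score

print(round(weighted_gain, 2))              # 1.62
print(f"{weighted_gain / total_gain:.0%}")  # about 85% of the total rise
```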
In 2016 ANU had two researchers in the list, which was used for the 2017 rankings. One was also on the 2017 list, used in 2018. In 2017 there were six ANU highly cited researchers, one from the previous year and one who had moved from MIT. The other four were long serving ANU researchers.
Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.
But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.
The improved scores for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Science Citation Index.
Saturday, August 18, 2018
Who Cares About University Rankings?
A paper by Ludo Waltman and Nees Jan van Eck asks what users of the Leiden Ranking are interested in. There's some interesting stuff but for now I just want to look at where the users come from.
The top ten countries where visitors originate are:
1. USA
2. Australia
3. Netherlands
4. UK
5. Turkey
6. Iran
7. South Korea
8. France
9. Germany
10. Denmark.
The authors consider the number of visitors from Australia, Turkey, Iran and South Korea to be "quite remarkable."
Let's look at other signs of interest in rankings. Here are the top countries for respondents to the 2018 QS academic survey:
1. USA
2. UK
3. Malaysia
4= Australia
4= South Korea
4= Russia
7= Italy
7= Japan
9= Brazil
9= Canada
And here are the top ten countries for visitors to this blog:
1. USA
2. UK
3. Russia
4. France
5. Germany
6. Ukraine
7. Canada
8. Malaysia
9. Australia
10. Singapore.
The three countries on all three lists are UK, USA and Australia. The countries on two lists are South Korea, Russia, Malaysia, France, Germany and Canada.
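The overlaps are easy to verify with a little set arithmetic (a sketch using the three lists above, abbreviated to country names):

```python
from collections import Counter

leiden = {"USA", "Australia", "Netherlands", "UK", "Turkey", "Iran",
          "South Korea", "France", "Germany", "Denmark"}
qs = {"USA", "UK", "Malaysia", "Australia", "South Korea", "Russia",
      "Italy", "Japan", "Brazil", "Canada"}
blog = {"USA", "UK", "Russia", "France", "Germany", "Ukraine",
        "Canada", "Malaysia", "Australia", "Singapore"}

# Countries on all three lists
print(leiden & qs & blog)  # {'USA', 'UK', 'Australia'}

# Countries on exactly two of the three lists
tally = Counter(c for s in (leiden, qs, blog) for c in s)
print({c for c, n in tally.items() if n == 2})
```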
https://www.cwts.nl/blog?article=n-r2s2a4&title=what-are-users-of-the-cwts-leiden-ranking-interested-in
http://rankingwatch.blogspot.com/2018/06/responses-to-qs.html
Saturday, August 11, 2018
Will THE do something about the citations indicator?
International university rankings can be a bit boring sometimes. It is difficult to get excited about the Shanghai rankings, especially at the upper end: Chicago down two places, Peking up one. There was a bit of excitement in 2014 when there was a switch to a new list of highly cited researchers and some universities went up and down a few places, or even a few dozen, but that seems to be over now.
The Times Higher Education (THE) world rankings are always fun to read, especially the citations indicator, which since 2010 has proclaimed a succession of unlikely places as having an outsize influence on the world of research: Alexandria University, Hong Kong Baptist University, Bilkent University, Royal Holloway University of London, National Research University MEPhI Moscow, Tokyo Metropolitan University, Federico Santa Maria Technical University Chile, St George's University of London, Anglia Ruskin University Cambridge, and Babol Noshirvani University of Technology Iran.
I wonder if the good and the great of the academic world ever feel uncomfortable about going to those prestigious THE summits while places like the above are deemed the equals, or the superiors, of Chicago or Melbourne or Tsinghua for research impact. Do they even look at the indicator scores?
These remarkable results are not the product of deliberate cheating but of THE's methodology. First, research documents are divided into 300-plus fields, five types of documents, and five years of publication, and then the world average (mean) number of citations is calculated for each type of publication in each field and in each year. Altogether there are more than 8,000 "cells" against which the citation average of each university in the THE rankings is compared.
This means that if a university manages to get a few well-cited publications in a field where citations are typically low, it can easily end up with a very high citations score.
Added to this is a "regional modification" in which the citation impact score is divided by the square root of the score of the country in which the university is located. This gives most universities an increased score, which is very small for those in productive countries and very large for those in countries that generate few citations. The modification is now applied to half of the citations indicator score.
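A toy illustration of how these two steps interact, as I understand the methodology (a sketch, not THE's actual code; the world averages, paper data and country score are all invented):

```python
import math

# Invented world mean citations for two (field, year, document type) cells
world_mean = {"particle physics": 25.0, "nursing": 1.2}

# A university with only two papers, both in the low-cited nursing cell
papers = [("nursing", 9.0), ("nursing", 15.0)]  # (cell, citations received)

# Field normalisation: each paper is compared with its own cell's world mean
raw = sum(cites / world_mean[cell] for cell, cites in papers) / len(papers)
print(round(raw, 1))  # 10.0, i.e. ten times the world average

# Regional modification: half the score is divided by the square root of
# the country's citation score (a low-output country here, score 0.25)
country_score = 0.25
modified = 0.5 * raw + 0.5 * raw / math.sqrt(country_score)
print(round(modified, 1))  # 15.0
```

Two papers in one low-cited cell are enough to produce a spectacular score, and the regional modification then inflates it further for universities in low-citation countries.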
Then we have the problem of those famous kilo-author mega-cited papers. These are papers with dozens, scores, or hundreds of participating institutions and similar numbers of authors and citations. Until 2015 THE treated every author as though they were the sole author of a paper, including those with thousands of authors. Then in 2015 they stopped counting papers with over a thousand authors, and in 2016 they introduced a modified fractional counting of citations for papers with over a thousand authors: citations were distributed proportionally among the authors, with a minimum allotment of five per cent.
There are problems with all of these procedures. Treating every author as the sole author meant that a few places could get massive citation counts from taking part in one or two projects such as the CERN project or the global burden of disease study. On the other hand, excluding mega-papers is also not helpful since it omits some of the most significant current research.
The simplest solution would be fractional counting all round: just dividing the number of citations of every paper by the number of contributors or contributing institutions. This is the default option of the Leiden Ranking and there seems no compelling reason why THE could not do so.
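Fractional counting takes only a few lines to state (a sketch of the general idea behind Leiden's default option, with invented paper data):

```python
# Each paper: (citations received, number of contributing institutions)
papers = [(3200, 800),  # a kilo-author mega-paper
          (40, 2),      # an ordinary collaboration
          (12, 1)]      # a single-institution paper

# Full counting: an institution gets all the citations of every paper
full = sum(cites for cites, _ in papers)

# Fractional counting: citations are shared among the contributors
fractional = sum(cites / n for cites, n in papers)

print(full, round(fractional, 1))  # 3252 vs 36.0
```

The mega-paper stays in the count, but it can no longer lift a university's citation score single-handedly.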
There are some other issues that should be dealt with. One is the question of self-citation. This is probably not a widespread issue but it has caused problems on a couple of occasions.
Something else that THE might want to think about is the effect of the rise in the number of authors with multiple affiliations. So far only one university has recruited large numbers of adjunct staff whose main function seems to be listing the university as a secondary affiliation at the top of published papers, but there could be more in the future.
Of course, none of this would matter very much if the citations indicator were given a reasonable weighting of, say, five or ten per cent, but it has more weight than any other indicator; the next largest is the research reputation survey at 18 per cent. A single mega-paper, or even a few strategically placed citations in a low-cited field, can have a huge impact on a university's overall score.
There are signs that THE is getting embarrassed at the bizarre effects of this indicator. Last year Phil Baty, THE's ranking editor, spoke about its quirky results.
Recently, Duncan Ross, data director at THE, has written about the possibility of a methodological change. He notes that currently the benchmark world score for the 8,000-plus cells is determined by the mean. He speculates about using the median instead. The problem with this is that a majority of papers are never cited, so the median for many of the cells is going to be zero. So he proposes, based on an analysis of the recent THE Latin American rankings, that the 75th percentile be used.
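The zero-median problem is easy to see with a toy cell (a sketch; the citation counts are invented, but the shape, with most papers uncited, is typical):

```python
import statistics

# Citation counts for the papers in one benchmark cell
cell = [0, 0, 0, 0, 0, 0, 1, 2, 5, 40]

print(statistics.mean(cell))    # 4.8, dragged up by one much-cited paper
print(statistics.median(cell))  # 0, useless as a benchmark

# The 75th percentile is non-zero and less sensitive to the outlier
print(statistics.quantiles(cell, n=4)[2])  # 2.75
```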
Ross suggests that this would make the THE rankings more stable, especially the Latin American rankings where the threshold number of articles is quite low.
It would also allow the inclusion of more universities that currently fall below the threshold. This, I suspect, is something that is likely to appeal to the THE management.
It is very good that THE appears willing to think about reforming the citations indicator. But a bit of tweaking will not be enough.
Sunday, July 22, 2018
The flight from bright: Dartmouth wants nice MBA students
The retreat from intelligence as a qualification for entrance into American universities continues. We have already seen the University of Chicago join the ranks of test-optional colleges and it seems that for many years Harvard has been discriminating against prospective Asian students who supposedly lack the qualities of grit, humour, sensitivity, kindness, courage, and leadership that are necessary to study physics or do research in economics.
There has been a lot of indignation about the implication that Harvard should actually think that Asians were uniquely lacking in humour and grit and so on.
But even if Asians were lacking in these qualities that is surely no reason to deny them admission to elite institutions if they have the ability to perform at the highest intellectual level. Sensitivity, kindness and a sense of humour etc are no doubt desirable but they are highly subjective, culture specific, difficult to operationalise and almost impossible to assess with any degree of validity. They also could have a disparate impact on racial, gender and ethnic groups.
Now Dartmouth College is going down the same path. What do you need to get into the Tuck School of Business?
"True to the school’s long-held reputation for being applicant-friendly and transparent in its admissions process, the new, simplified criteria comprise four attributes reflective of successful Tuck students: smart, nice, accomplished, and aware."
I doubt that Dartmouth will be the only place to admit students because they are nice, or good at pretending to be nice or able to afford niceness trainers. And how will niceness be assessed?
There will be an essay: "Tuck students are nice, and invest generously in one another's success. Share an example of how you have helped someone else succeed. (500 words)."
Referees will be asked: "Tuck students are nice. Please comment on how the candidate interacts with others including when the interaction is difficult or challenging."
Soon no doubt we will hear demands for the niceness of students to be included as an indicator in university rankings. There will be compulsory workshops on how to confront the nastiness within. Studies will show that niceness is an essential attribute for success in research, business, sport, war and journalism and that it is something in which ciswhitestraightmales, especially those not differently abled, are desperately deficient.
And we are likely to see articles wondering why Asian universities are mysteriously overtaking the West in anything based on cognitive skills.
My article in University World News can be accessed here. Comments can be made at this blog.
How should rankings assess teaching and learning?
Richard Holmes, 20 July 2018, Issue No: 515