The number and frequency of international university rankings are constantly increasing, so I am starting to use a standard format.
QS BRICS Rankings
Source
Scope
Universities in Brazil, Russia, India, China and South Africa. Does not include Hong Kong, Macau or Taiwan.
Methodology
Unchanged since last year; a toy calculation showing how the weights combine follows the list below.
Academic Reputation 30%
Employer Reputation 20%
Faculty/Student Ratio 20%
Staff with a PhD 10%
Papers per Faculty 10%
Citations per Paper 5%
International Faculty 2.5%
International Students 2.5%
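For readers who want to see how these weights translate into an overall score, here is a minimal sketch. The indicator values are invented, and QS's scaling of each indicator to a 0-100 scale is not reproduced; this only shows the weighted sum.

```python
# Toy weighted-sum calculation using the QS BRICS indicator weights.
# The indicator scores below are invented for illustration; QS scales each
# indicator to 0-100 before applying the weights, which is not reproduced here.

WEIGHTS = {
    "academic_reputation": 0.30,
    "employer_reputation": 0.20,
    "faculty_student_ratio": 0.20,
    "staff_with_phd": 0.10,
    "papers_per_faculty": 0.10,
    "citations_per_paper": 0.05,
    "international_faculty": 0.025,
    "international_students": 0.025,
}

def overall_score(indicator_scores):
    """Return the weighted sum of indicator scores (assumed to be on 0-100 scales)."""
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

example = {
    "academic_reputation": 90,
    "employer_reputation": 80,
    "faculty_student_ratio": 70,
    "staff_with_phd": 60,
    "papers_per_faculty": 50,
    "citations_per_paper": 40,
    "international_faculty": 30,
    "international_students": 20,
}

print(f"Overall score: {overall_score(example):.1f}")  # roughly 71
```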
Top Ten
1. Tsinghua University
2. Peking University
3. Lomonosov Moscow State University
4. University of Science and Technology of China
5. Fudan University
6. Nanjing University
7. Universidade de Sao Paulo
8. Shanghai Jiao Tong University
9= Universidade Estadual de Campinas
9= University of Cape Town
Countries with Universities in the Top Hundred
China 40
Brazil 19
Russia 18
India 15
South Africa 8
Selected Significant Changes
Harbin Institute of Technology down from 23rd to 27th
Wuhan University down from 26th to 33rd
Tomsk State University up from 58th to 47th
Manipal Academy of Higher Education up from 100th to 85th
Wednesday, June 25, 2014
Off Topic: Should Cambridge and ICL Emulate Wayne State?
This is from the Independent, which is supposed to be a serious newspaper, on 18 June:
"British cities seeking to adapt to the realities of the new global economy should model their plans on the success of United States conurbations including Detroit, a former urban development advisor to President Obama has told The Independent. ...
But Bruce Katz, vice president of the Washington think tank the Brookings Institution, who has advised both the Clinton and Obama White Houses on urban regeneration, said that Detroit was now part of the metro revolution that is transforming the lives of millions of citizens and rebuilding the shattered US economy."
Five days later, in the Independent:
"British cities seeking to adapt to the realities of the new global economy should model their plans on the success of United States conurbations including Detroit, a former urban development advisor to President Obama has told The Independent. ...
But Bruce Katz, vice president of the Washington think tank the Brookings Institution, who has advised both the Clinton and Obama White Houses on urban regeneration, said that Detroit was now part of the metro revolution that is transforming the lives of millions of citizens and rebuilding the shattered US economy."
Five days later in the Independent.
"Activists angered by the closing of water accounts for thousands of people behind in their payments have taken their fight to the United Nations.
In March, the Detroit Water and Sewerage Department (DWSD) announced that it would start cutting off the services of homes, schools and businesses that were at least 60 days overdue or more than $150 behind.
It said it wanted to start recouping $118million owed from unpaid bills and that a fierce approach was needed to coax money from delinquent accounts, which make up almost half of the city’s total.
The move, in which as many as 3,000 properties were expected to be cut off each week, has outraged campaigners."
Monday, June 23, 2014
It Was the God Particle!
The truth about Panjab University's (PU) rise in the Times Higher Education (THE) World University Rankings -- and in no other ranking -- is revealed in the Times of India.
Shimona Kanwar notes:
"The paper on the discovery of Higgs boson particle better known as the God particle, which earned the Nobel Prize in Physics last year, has come as blessing for Panjab University (PU). PU earned an overall score of 40.2, most of which has been contribution of citations from the university's publications. The paper on the God particle had 10,000 citations, which helped immensely give the numero uno status to PU in the country.
The Times Higher Education Asia University ranking-2014 had four parameters - teaching, international outlook, industry income, research and citations. Out of the 30% score on citations, 84.7 was the top score, which gave the university an edge over the other 21 participating universities. This included Jawaharlal Nehru University, Delhi University, Aligarh Muslim University and IIT Kharagpur among others. Though the CERN project which was associated with the discovery of the God particle involved participation from Delhi University as well, a huge number of PhD students in the project from PU apparently contributed in this rank.
"We had build parts of a detector, contributed for the hardware, software and physics analysis in the Compact Muon Solenoid (CMS) stage of the God particle discovery," said Prof Manjit Kaur of PU, who was part of the project.
Panjab University had 12-15 PhD students and five faculty members from the department of physics who worked in collaboration for the prestigious project."
A couple of things are missing, though. Delhi University (DU) also joined the project but did not even get into the top 100 of the Asian rankings. How come? It wasn't those doctoral students. It was probably (we can't be certain without seeing the scores for all the indicators) because, although PU had fewer citations than DU over the relevant period, it also had significantly fewer papers to divide them by.
The trick to getting on in the THE rankings is not just to get lots of citations in the right field and the right year and the right country but also to make sure the total number of papers doesn't get too high.
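To make the arithmetic behind this concrete, here is a minimal sketch with invented figures (not the real PU or DU counts, and ignoring the field and year normalisation that THE's citations indicator actually applies):

```python
# Toy illustration: why a smaller publication count can produce a higher
# citations-per-paper figure when a handful of papers are cited massively.
# The figures are invented; the real THE indicator is field- and
# year-normalised, which this sketch ignores.

def citations_per_paper(total_citations, total_papers):
    return total_citations / total_papers

small_output = citations_per_paper(total_citations=12_000, total_papers=900)    # hypothetical "PU-like" case
large_output = citations_per_paper(total_citations=14_000, total_papers=4_000)  # hypothetical "DU-like" case

print(f"Small paper count: {small_output:.1f} citations per paper")  # 13.3
print(f"Large paper count: {large_output:.1f} citations per paper")  # 3.5
```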
And, as I noted yesterday, if TR, THE's data collectors, do what they have done for the Highly Cited Researchers database and stop counting physics publications with more than 30 affiliations, then PU will almost certainly fall out of the rankings altogether.
Thursday, June 19, 2014
The New Highly Cited Researchers List
Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities (ARWU) has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one, which has more than 7,000.
The new list got off to a bad start in 2013 because the preliminary list was based on a faulty procedure and because of problems with the assignment of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.
The list contains a number of researchers who appear more than once. Just looking at the list of Harvard researchers for a few minutes, I noticed that David M Sabatini, whose primary affiliation is MIT, with secondary affiliations at the Broad Institute of Harvard and MIT, is listed for Biology and Biochemistry and also for Molecular Biology and Genetics.
Eric S Lander, whose primary affiliation is the Broad Institute of Harvard and MIT, with secondary affiliations at MIT and Harvard, is listed three times: for Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.
Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.
This no doubt reflects the reality of scientific research, in which a single researcher might well excel in two or more closely related fields, but if ARWU are simply going to count the number of researchers in the new list, there will be distortions if some are counted more than once.
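If ARWU were simply to count entries, the size of the distortion is easy to see. A minimal sketch, using invented records modelled on the examples above:

```python
# Sketch of the double-counting problem: list entries versus unique researchers.
# The records are illustrative, based on the examples mentioned in the post.

entries = [
    {"name": "Eric S Lander", "field": "Biology and Biochemistry"},
    {"name": "Eric S Lander", "field": "Clinical Medicine"},
    {"name": "Eric S Lander", "field": "Molecular Biology and Genetics"},
    {"name": "Frank B Hu", "field": "Agricultural Sciences"},
    {"name": "Frank B Hu", "field": "Clinical Medicine"},
    {"name": "Frank B Hu", "field": "Molecular Biology and Genetics"},
]

print("Entries counted:", len(entries))                          # 6
print("Unique researchers:", len({e["name"] for e in entries}))  # 2
```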
The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 of them in this case -- and by year of publication. In other words, it is not the raw number of citations that matters but the number relative to the world average for the same field and year of publication.
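A minimal sketch of what field and year normalisation means in practice. The baseline figures and citation counts are invented; the point is only that the same raw citation count translates into different relative impact depending on the field and year benchmark.

```python
# Sketch of field- and year-normalised citation impact.
# The world-average baselines and the citation counts are invented.

# Hypothetical world-average citations per paper, keyed by (field, publication year)
WORLD_BASELINE = {
    ("Physics", 2008): 11.2,
    ("Clinical Medicine", 2008): 14.5,
}

def normalised_impact(citations, field, year):
    """Citations relative to the world average for the same field and year."""
    return citations / WORLD_BASELINE[(field, year)]

# The same raw citation count looks stronger in a field with a lower baseline.
print(round(normalised_impact(56, "Physics", 2008), 2))            # 5.0
print(round(normalised_impact(56, "Clinical Medicine", 2008), 2))  # 3.86
```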
TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.
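The stated exclusion rule is simple enough to express as a filter. A minimal sketch with invented paper records (not TR's actual pipeline or data):

```python
# Sketch of the exclusion rule described above: drop Physics papers that carry
# more than 30 institutional addresses. The paper records are invented.

papers = [
    {"title": "Large collider collaboration paper", "field": "Physics", "addresses": 200},
    {"title": "Small-group physics paper", "field": "Physics", "addresses": 4},
    {"title": "Multi-centre clinical trial", "field": "Clinical Medicine", "addresses": 45},
]

def keep(paper, max_physics_addresses=30):
    """Exclude Physics papers with more than 30 institutional addresses."""
    return not (paper["field"] == "Physics" and paper["addresses"] > max_physics_addresses)

analysed = [p for p in papers if keep(p)]
print([p["title"] for p in analysed])
# ['Small-group physics paper', 'Multi-centre clinical trial']
```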
I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.
Another noticeable thing about the new lists is the large number of secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliations to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah is listed as a secondary affiliation by 150 researchers. Whether Thomson Reuters or ARWU will be able to determine that these represent genuine associations is questionable.
The publication of the new lists is further evidence that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations-based indicator or only one database.