Diversity and Rankings
Robert Morse, director of data research at US News and World Report, discusses whether "diversity" should be included in the rankings of American law schools.
"I was one of speakers on the "Closing Plenary: Reforming U.S. News Rankings to Include Diversity" panel, which discussed many of the issues pertaining to whether U.S. News should add a measure of diversity directly into the Best Law Schools rankings. I pointed out that U.S. News believes diversity is important and that is why we all ready publish a separate law school diversity index.
Our current index identifies law schools where law students are most and least likely to encounter classmates from a different racial or ethnic group. However, the current index does not measure how successful each law school is at meeting a diversity goal or benchmark at the school, state, local, or national level. It also gives schools enrolling one ethnic group a low score, though that school's enrolment may match its state's ethnic population or the school may be a Historically Black College or University. It's for these reasons the current index would not be appropriate to add into the rankings".
Diversity here does not mean diversity of ideology, religion, class, politics or nationality. It simply means the numbers of recognised minorities, mainly African-Americans, Hispanics and Asian Americans.
It is interesting to look at the diversity index and to see the likely effect of including diversity in the law school rankings. The most diverse law school is the University of Hawaii. The University of the District of Columbia and Florida International University also get high scores. Low scorers include Harvard, Yale and UCLA.
Somehow, I do not think that an indicator that benefited Florida International University at the expense of Harvard would add to the credibility of these rankings.
Unless it can be demonstrated that there is something magically transformative about a law school's statistical profile reflecting that of its city, state, or nation (present or future), this proposal does not sound like a very good idea.
Thursday, December 01, 2011
The Utility of Rankings
Another advantage of a good performance in the international university rankings is that a university's graduates will be able to get into Russian postgraduate programs, provided the university is located in a G8 country.
Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.
The list will include only universities located within the G8 countries that enter the top 300 in the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.
The reform is intended to attract more students to take part in Russian MA and PhD programs.
Saturday, November 26, 2011
Update 5 on El Naschie vs Nature
The BBC has another piece on the Nature case by Pallab Ghosh. It seems that El Naschie is now admitting that his papers were not peer reviewed but argues that this was because he had no peers who could review them:
He said that he would discuss his papers with fellow scientists, and only when he thought that they were of a sufficiently high standard would he publish them. "I am too arrogant and have too much self-respect to allow a bad paper to pass through," he said.
Prof El Naschie called one witness, Prof Otto Rossler - an honorary editor of Chaos, Solitons and Fractals.
He told the court that there was no-one who could peer review him, referring to Prof El Naschie, because "if you have something new to offer, peer review is dangerous", adding that in such cases "peer review delays progress in science".
Prof El Naschie asked his witness whether he thought that his (Prof El Naschie's) papers were of "poor quality".
Prof Rossler replied: "On the contrary, they were very important and will become more important in the future."
And he added: "You are the most hard-working and diligent scientist I have ever met."
It is useful to compare Rossler with Neil Turok, Nature's expert witness. Rossler is lately best known for warning that the Large Hadron Collider risked creating black holes that would destroy the world. See El Naschie Watch for this and other information. He is also a self-proclaimed simultaneous submitter, something that for journal editors is almost as bad as plagiarism. A comment on Rossler's claims concludes: "To conclude: this text would not pass the referee process in a serious journal".
It seems that it is increasingly difficult to argue that Alexandria University's remarkable scores for research impact in the Times Higher Education World University Rankings were the result of outstanding, excellent or even controversial research.
I am not sure what an honorary editor is, but at the moment Rossler is not listed as any sort of editor on the official Chaos, Solitons and Fractals home page.
Incidentally, since the trial will presumably turn to the question of El Naschie's affiliations at some point, this page lists him as Founding Editor but does not give any affiliation.
The Politics of Ranking
One of the more interesting aspects of the university ranking business is the way it is used by local politicians to advance their agendas. This is especially obvious in Malaysia, where errors and methodological changes have sent local universities bouncing up and down the QS rankings. Every rise is proclaimed a vindication of government policy while every fall is accompanied by head-shaking from the opposition.
This year Universiti Malaya moved into the QS top 200. There is nothing surprising about that: it has been there before. More significant was getting into the top 500 of the Shanghai Academic Ranking of World Universities. That is a lot harder, but also more likely to reflect real underlying changes. It seems that UM has finally realised that a little encouragement and financial support can produce quite significant results in a short period of time.
Patrick Lee, in the blog of opposition leader Lim Kit Siang, comments:
PETALING JAYA: Malaysia has little to show for its universities despite spending more money on tertiary education than do many other countries.
Malaysian universities lag behind many counterparts in Asia, including those located in neighbouring countries like Thailand and Singapore, according to a World Bank report released today.
“While Malaysia spends slightly more than most countries on its university students, leading Malaysian universities perform relatively poorly in global rankings,” said the report, entitled Malaysia Economic Monitor: Smart Cities.
Citing the Quacquarelli Symonds (QS) World University Rankings 2010, it noted that Universiti Malaya (UM) was ranked 207th worldwide and 29th in Asia.
It also quoted a US News and World 2011 report on the World’s Best Universities, which put UM, Universiti Kebangsaan Malaysia, Universiti Sains Malaysia and Universiti Putra Malaysia at 167th, 279th, 335th and 358th place respectively.
Even more worrying, the World Bank report observed, was the “increasing gap” between Malaysia’s and Singapore’s universities.
It compared UM with the National University of Singapore (NUS), which QS cited as the leading university in Southeast Asia.
“The gap between UM and NUS has been high and generally increasing, especially in the sciences,” the report said.
According to the report, UM and NUS were on par when it came to science and technology in 2005. However, UM has lost out to NUS over the past six years.
The report also said many of Malaysia’s university graduates did not seem to have the skills that would help them get employment.
First, the QS and the US News and World Report rankings are the same: US News republishes the QS tables, so the report is in effect citing one ranking twice. Second, it is a lot easier to start a university in Malaysia than in Singapore.
Even so, moving into the Shanghai rankings is a real advance and should be recognised as such.
Update 4 on El Naschie vs Nature
Gervase de Wilde has an article on the case on Inforrm's Blog. He refers to the Guardian's account and then adds the following comment:
Comment
The case seems to offer ammunition to libel reformers. Even in the absence of the ill-advised and incoherent aspects of his case which were excluded before trial, and of the implicit comparisons of his work to Einstein’s made during the first five days at the High Court, his claim against a venerable and highly respected scientific journal seems a poor substitute for meeting their allegations head on in some form of correspondence or public debate. Moreover, the journal had published the Claimant’s own defence of his methods in running CSF, that he sought to emphasise scientific content above impressive affiliations, in the original article.
A spokesperson for the Libel Reform campaign, speaking to the Guardian, commented that reform can’t come soon enough, since
“Scientists expect publications like Nature to investigate and write about controversies within the scientific community. The threat of libel action is preventing scientific journals from discussing what is good and bad science.”

However, the public interest defence argued for by campaigners is one which is already being employed. The BBC reports that Andrew Caldecott QC’s opening statement for the Defendants described their defence as relying on the article being “true, honest opinion and responsible journalism on an issue of public interest”.
As the choice of witnesses indicates, the case does touch on the seemingly incomprehensible branch of physics in which the Claimant has made his academic career. In this respect there is a threat of a libel action stifling academic debate, and a similarity to BCA v Singh [2010] EWCA Civ 350, where opinions expressed in a controversy on what was essentially a scientific matter were at issue. But it is also about the methods he employed in running a publication in the context of a widely recognised system of accreditation and review, and about allegations regarding the professional affiliations which feature on his website. These are the kind of criticisms that might be made about any professional person, and would not necessarily come under the scope of a scientific exception for “rigorous debate” on good and bad science urged by campaigners.
It is questionable whether the phrase "academic career" is appropriate since El Naschie has not apparently held any formal permanent academic posts recently unless one counts the award of an Emeritus Professorship by Alexandria University, a strange distinction since there is no sign of the professorship from which he retired.
Friday, November 18, 2011
Update 3 on El Naschie vs Nature
The Guardian has a substantial report on the case by Alok Jha. It seems that El Naschie believes that expert witness Neil Turok is unqualified to understand his work. It is difficult to see how this argument, even if valid, bears on whether or not peer review took place.
Should the court decide in favour of El Naschie, it would provide some sort of justification for the methods used in the citations indicator in the Times Higher Education rankings which gave high scores to Alexandria University mainly or partly because of the many citations of papers by El Naschie.
El Naschie is suing Nature as a result of a news article published in 2008, after the scientist's retirement as editor-in-chief of the journal Chaos, Solitons and Fractals. The article alleged that El Naschie had self-published several research papers, some of which did not seem to have been peer reviewed to an expected standard and also said that El Naschie claimed affiliations and honorary professorships with international institutions that could not be confirmed by Nature. El Naschie claims the allegations in the article were false and had damaged his reputation.
On Friday, Nature called Professor Neil Turok, a cosmologist and director of the Perimeter Institute in Canada, as an expert witness to assess some of the work published by El Naschie.
Turok described his expertise as being in cosmology. "I work at the theoretical end of cosmology … my work consists of applying unified theories, such as string theory, to the most difficult questions in cosmology, namely the beginning of the universe or the initial singularity, the moment where everything was at a single point in the conventional description."
In his evidence, Turok said he found it difficult to understand the logic in some of El Naschie's papers. The clear presentation of scientific ideas was an important step in getting an idea accepted, he said. "There are two questions – one is whether the work is clearly presented and readers would be able to understand it. It would be difficult for a trained theoretical physicist to understand [some of El Naschie's papers]. I couldn't understand it and I made a serious attempt to understand it. The second question is about the correctness of the theory and that will be decided by whether it agrees with experiments. Most theories in theoretical physics are speculative – we form a logical set of rules and deductions and we try, ultimately, to test the deductions in experiments. For me, clear presentation is the first thing in the presentation of a theory."
In response, El Naschie pointed out that even Albert Einstein had made mistakes in his publications. "Einstein is the most sloppy scientist ever. He never defined his quantities, he doesn't put in references and he made so many mistakes of mathematics and concepts. He was a very natural man when he explained something to lay people. But Einstein, whom I admire very much because he had imagination and the courage to stand up to the bloody Nazis, Einstein was an extremely sloppy man."
Later in the session, El Naschie accused Turok of having "no idea" about mathematics and being unqualified to assess his work. "If somebody doesn't understand things, it's his own limitation," El Naschie said.
Thursday, November 17, 2011
The Influence of Rankings
Varsity, the student newspaper at Cambridge, suggests that British universities are recruiting staff in order to improve their position in the QS rankings:
Matthew Knight, chairman of Universities HR and the University of Leeds HR director, said: “Within the context of £9,000 fees, many universities have a strategic drive to improve the quality of the student experience.
“Therefore, many are taking the opportunity to improve student staff ratios regardless of the numbers of applicants. So there’s a lot of recruitment going on at some universities, although there’s no specific pattern to this.”
As the QS World University Rankings use student-faculty ratios as the only globally comparable indicator to determine their tables, an increase in employment can be used to promote a university’s image and attract students.
Wednesday, November 16, 2011
Update 2 on El Naschie and Nature
Note that New Scientist describes El Naschie as an "independent physicist". Does this imply that he has no affiliation and that Nature was correct in questioning his claims to academic status?
Update on El Naschie and Nature
The New Scientist has provided some coverage of the trial which is also discussed at El Naschie Watch. On November 15, this item by Chelsea Whyte appeared:
Benjamin De Lacy Costello, a materials scientist at the University of the West of England in Bristol, UK, testified yesterday that when El Naschie was editor, the peer-review process at Chaos, Solitons and Fractals was "frustrating" and unlike that of other journals.
With regard to the dispute over El Naschie's affiliations, Timothy John Pedley, former head of the department of applied mathematics and theoretical physics at the University of Cambridge, said that El Naschie was a visiting scholar with access to libraries and collaborations at the department, but was not an honorary scholar working with the privileges of a professor.
On November 16 this update appeared:
Update: Mohamed El Naschie, a former editor of the scientific journal Chaos, Solitons and Fractals, appeared in London's High Court today for the libel lawsuit he has brought against the scientific journal Nature.
El Naschie is representing himself. During El Naschie's cross-examination of journalist Quirin Schiermeier, who wrote the 2008 article about him, Schiermeier stood by the content of the work, saying, "We wrote the article because you published 58 papers in one year in a journal where you acted as editor-in-chief. That is unusual and potentially unethical."
El Naschie responded that he felt it wasn't unheard of for journals to publish work that isn't peer-reviewed. He also said that his work had been stolen. "We published my work to secure it," he told the court. "Senior people are above this childish, vain practice of peer review."
I am not an expert, but El Naschie no longer appears to dispute that his pattern of self-publication was unusual or that there was little or no peer review. He is simply claiming that publication was necessary to preempt the theft of his work by rivals, and that the absence of peer review was excused by his seniority. Whether that is inconsistent with Nature's comments is, I assume, a matter for the judge to decide.
El Naschie and Nature
The El Naschie vs Nature case is under way at the Royal Courts of Justice in London.
Briefly, Mohamed El Naschie, the former editor of the journal Chaos, Solitons and Fractals, is suing the journal Nature and the writer Quirin Schiermeier for its comments on the journal's publication of many of his own papers.
El Naschie is claiming that he was defamed by the suggestion that his papers were of poor quality and were published without a normal peer review process. He also claims that he had been defamed by the imputation that he had claimed academic affiliations to which he was not entitled.
The case is of vital importance to academic freedom since if successful it would mean that wealthy persons could stifle even the most balanced and temperate comments on scientific and scholarly activities.
It is also of importance to the question of international university rankings, since El Naschie's unusual self-publication and self-citation within a short period of time, in a field where citations are low, allowed Alexandria University to achieve an extraordinarily high score in the 2010 Times Higher Education World University Rankings. Even this year, the university had an unreasonably high score in the ranking's research impact indicator. If El Naschie were successful in his claim, then Times Higher and Thomson Reuters, who collected and analysed the data for the rankings, would be able to argue that they had uncovered a small pocket of excellence.
The case has been covered extensively in El Naschie Watch and has been discussed in the scientific press.
Updates will be provided from time to time.
Tuesday, November 15, 2011
The THE Subject Rankings
The ranking season has drawn to a close, or at least it will once we have digested the feasibility report from the European Commission's U-Multirank project. Meanwhile, to tie up some loose ends, here are the top three from each of THE's subject group rankings.
Engineering and Technology
1. Caltech
2. MIT
3. Princeton
Arts and Humanities
1. Stanford
2. Harvard
3. Chicago
Clinical, Pre-Clinical and Health
1. Oxford
2. Harvard
3. Imperial College London
Life Sciences
1. Harvard
2. MIT
3. Cambridge
Physical Sciences
1. Caltech
2. Princeton
3. UC Berkeley
Social Sciences
To be posted on the 17th of November.
Monday, November 07, 2011
Conference in Shanghai
I hope to post something in a day or two on the recent World Class Universities conference in Shanghai. Meanwhile, there is an interesting comment by Alex Usher of Higher Education Strategy Associates, a Canadian consulting firm.
"In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of three main competing world rankings in education, the others being the Times Higher Education Supplement (THES) and the QS World Rankings.
The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).
In terms of reliability, there’s really no comparison. If you look at the correlation between the indicators used in each of the rankings, THES and QS are very weak (meaning that the final results are highly sensitive to the weightings), while the Shanghai rankings are very strong (meaning their results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.
(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)
Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with its more catholic inclusion policy (500 institutions ranked compared to the THES’s 200), it allows more institutions to compare themselves to the best in the world – at least as far as research is concerned."
Some technical points. First, Times Higher Education Supplement changed its name to Times Higher Education when it converted to a magazine format in 2008.
Second, the Shanghai rankings are not entirely free from commercial pressures themselves although that has probably had the laudable effect of maintaining a stable methodology since 2003.
Third, both THE and QS accept data from institutions but both claim to have procedures to validate them. Also, the Shanghai rankings do include data from government agencies in their productivity per capita criterion and in some places that might not be any more valid than data from universities.
Fourth, until recently there was a significant difference in the expert opinion used by THE and by QS. Most of QS's survey respondents were drawn from the mailing lists of the Singapore- and London-based academic publisher World Scientific, while THE's are drawn from those who have published papers in the ISI indexes. All other things being equal, we would expect THE's respondents to be more expert. This year the difference has been reduced somewhat, as QS are getting most of their experts from the Mardev lists supplemented by a sign-up facility.
Fifth, although THE publish a list of 200 universities in print and on their site, there is an easily downloadable iPhone app that lists 400 universities.
The most important point, though, is the question of consistency. It is quite true that the various indicators in the Shanghai rankings correlate quite closely or very closely with one another (.46 to .90 in 2011, according to a conference paper by Ying Chen and Yan Wu of the Shanghai Center for World-Class Universities) while some of those in the QS and THE rankings have little or no relation to one another. However, it could be argued that if two indicators correlate highly then they are to some extent measuring the same thing and one of them is redundant. Still, that is probably better than indicators which statistically have little to do with one another.
What is more important perhaps is the consistency from one year to another. The main virtue of the Shanghai rankings is that changes in position can be assumed to reflect actual real world changes whereas those in the THE and QS rankings could easily be the result of methodological changes or, in the case of THE, omissions or inclusions.
Friday, October 28, 2011
An Error
This replaces an earlier post.
Last year Times Higher Education admitted to an error involving Monash University and the University of Adelaide:
Also, after the launch of the World University Rankings 2010 it became apparent that, owing to a data processing error, the ranking positions of two Australian universities in the top 200 list were incorrect — the University of Adelaide and Monash University.
Both universities remain in the top 1 per cent of world universities.
This year, a representative of Adelaide commented on the error:
Adelaide's DVCR Mike Brooks said it had been "disconcerting'' that there had been a data processing error last year in the first iteration of the revised rankings since their split from QS. "It certainly raises further questions about the credibility of the rankings,'' Professor Brooks said.
"Based on our own analysis we believe that we have a similar ranking this year to that of 2010. The shift in position is attributed to the error in the processing last year, ongoing changes in THE methodology and increased competition.''
"I think the students and the wider community are able to judge for themselves. As South Australia's leading research-university and only member of the Group of Eight, I know that we are in an incredibly strong position for the future.''
Adelaide's fall seems to have been due very largely to a massive fall in its score for research impact. How much of this was due to the correction of the 2010 error, how much to changes in THE's methodology, and how much to the inherent instability of the normalisation procedure is not clear.
Monday, October 17, 2011
GLOBAL: Despite ranking changes, questions persist
My article on the Times Higher Education World University Rankings can be accessed at University World News:
The international university ranking scene is starting to look like the heavyweight boxing division. Titles are proliferating and there is no longer an undisputed champion of the world. Caltech has just been crowned top university by Times Higher Education and Thomson Reuters, their data collectors and analysts, while QS have put Cambridge in first place. Over at Webometrics, MIT holds the number one spot. But Harvard has the consolation of remaining the top university in the Scimago and HEEACT rankings as well as the Academic Ranking of World Universities, ARWU, published by Shanghai Jiao Tong University.
Read here
Sunday, October 09, 2011
Rising Stars of the THE - TR Rankings
These are some of the universities that have risen significantly in the rankings compared to last year.
Europe
Karolinska Institute
Munich
LSE
Zurich
Leuven
Wageningen
Leiden
Uppsala
Sheffield
Humboldt
USA
UC Davis
Minnesota
Penn State
Michigan State
Australia
Monash
Asia
Osaka
Tohoku
Caltech in First Place
The big news of the 2011 THE - TR rankings is that Caltech has replaced Harvard as the world's top university. So how exactly did they do it?
According to the Times Higher iPad apps for this year and last (easily downloadable from the rankings page), Harvard's total score fell from 96.1 to 93.9 and Caltech's from 96.0 to 94.8, turning a 0.1 Harvard lead into one of 0.9 for Caltech.
Harvard continued to do better than Caltech in two indicators, with 95.8 for teaching and 67.5 for international orientation compared to 95.7 and 56.0 for Caltech.
Caltech is much better than Harvard in industry income (innovation) but that indicator has a weighting of only 2.5%.
Harvard's slight lead in the research indicator has turned into a 0.8-point lead for Caltech.
Caltech is still ahead for citations but Harvard caught up a bit, narrowing the lead to 0.1.
So it seems that what made the difference was the research indicator. It is unlikely that Caltech could have overcome Harvard's massive lead in reputation for research and postgraduate teaching: last year it was 100 compared with 23.5. That leaves us with research income per faculty.
According to Phil Baty:
"Harvard reported funding increases that are similar in proportion to those of many other universities, whereas Caltech reported a steep rise (16 per cent) in research funding and an increase in totalinstitutional income."
This seems generally compatible with Caltech's 2008-2009 financial statement according to which:
Before accounting for investment losses, total unrestricted revenues increased 6.7% including JPL, and 14.0% excluding JPL
and
Research awards in FY 2009 reached an all-time high of $357 million, including $29 million of funds secured from the federal stimulus package. Awards from federal sponsors increased by 34.4%, while awards from nonfederal sponsors increased by 20.7%. We also had a good year in terms of private giving, as donors continue to recognize the importance of the research and educational efforts of our outstanding faculty and students.
It seems that research income is going to be the tie-breaker at the top of the THE - TR rankings. This might not be such a good thing. Income is an input. It is not a product, although universities everywhere apparently think so. There are negative backwash effects coming if academics devote their energies to securing grants rather than actually doing research.
Update on Alexandria
El Naschie Watch reports that Hend Hanafi, President of Alexandria University, has resigned following prolonged student protests.
Apparently she was under fire because of her links to the old regime but one wonders whether her university's apparent fall of nearly 200 places in the THE - TR rankings gave her a final push. If so, we hope that Times Higher will send a letter of apology for unrealistically raising the hopes of faculty and students.
Meanwhile over in Alexandria
One of the strangest results of the 2010 THE - TR rankings was the elevation of Alexandria University in Egypt to the improbable status of fourth best university in the world for research impact and 147th overall. It turned out that this was almost entirely the work of precisely one marginal academic figure, Mohamed El Naschie, former editor of the journal Chaos, Solitons and Fractals, whose work was copiously cited by himself, by other authors in his journal, and by those in an Israeli-published journal (now purchased by De Gruyter) of which he was an editor.
The number of citations collected by El Naschie was not outrageously high, but it was much higher than usual for his discipline and many of them came within a year of publication. This meant that El Naschie and Alexandria University received massive credit for his citations, since Thomson Reuters' normalisation system compared them with the international average in a field where citations are low, especially in the first year after publication.
Alexandria was not the only university to receive an obviously inflated score for research impact. Hong Kong Baptist University received a score of 97.6 and Bilkent one of 95.7, although in those two cases it seems that the few papers that contributed to these scores did have genuine merit.
It should be remembered that the citation scores were averages and that a few highly cited papers could have a grossly disproportionate effect if the total number of published papers was low.
This year Thomson Reuters went to some length to reduce the impact of a few highly cited papers. They have to some extent succeeded. Alexandria's score is down to 61.4 for citations (it is in 330th place overall), Bilkent's to 60.8 (222nd place overall) and HKBU's to 59.7 (290th place overall).
These scores are not as ridiculous as those of 2010 but they are still unreasonable. Are we really expected to believe that these schools have a greater research impact than the University of Sydney, Kyoto University, the London School of Economics, Monash University and Peking University, all of which have scores in the fifties for this indicator?
I for one cannot believe that a single paper or a few papers, no matter how worthwhile, can justify inclusion in the top 300 world universities.
There is another problem. Normalisation of citations by year is inherently unstable. One or two papers in a low-citation discipline that are cited within a year of publication will boost the citations indicator score, but after a year their effect diminishes because the citations are now coming more than a year after publication and are set against a larger baseline.
Alexandria's score was due to fall anyway: El Naschie has published very little lately, so his contribution to the citations score would have fallen whatever methodological changes were introduced. And if he ever starts publishing again?
Also, if Thomson Reuters are normalising by field across the board, this raises the possibility that universities will be able to benefit simply by reclassifying research grants, moving research centres from one field to another, manipulating abstracts and keywords, and so on.
Friday, October 07, 2011
Who else is down ?
Just looking at the top 200 of the THE rankings, these universities have fallen quite a bit.
University of North Carolina Chapel Hill
Sydney
Ecole Normale Superieure
Ecole Polytechnique
Trinity College Dublin
University College Dublin
William and Mary College
University of Virginia
Asian Decline?
The Shanghai rankings have shown that universities in Korea, China (including Taiwan and Hong Kong) and the Middle East have been steadily advancing over the years. Did they get it wrong?
The latest Times Higher Education - Thomson Reuters rankings appear to prove that Asian universities have inexplicably collapsed over the last year. Tokyo has dropped from 26th to 30th place. Peking has fallen twelve places to 49th. Pohang University of Science and Technology and the Hong Kong University of Science and Technology have slipped out of the top fifty. Bilkent and Hong Kong Baptist University are way down. The decline of the University of Science and Technology of China is disastrous: from 49th to 192nd. Asian universities are going to be dangerous places for the next few days, with students and teachers dodging university administrators jumping out of office windows.
Of course, massive declines like this do not reflect reality: they are simply the result of the methodological changes introduced this year.
Anyone accessing a ranking site or downloading an iPad app should be made to click on a box reading "I understand that the methodological changes in the rankings mean that comparison with last year's ranking is pointless and I promise not to issue a public statement or say anything to anyone until I have had a cup of tea and I have made sure that everybody else understands this."
Thursday, October 06, 2011
New Arrivals in the THE Top 200.
Every time a new ranking is published there are cries for the dismissal or worse of vice-chancellors or presidents who allowed their universities to lose ground. There will no doubt be more demands as the results of this year's THE rankings are digested. This will be very unjust since there are reasons why universities might take a tumble that have nothing to do with any decline in quality.
First, Thomson Reuters, THE's data collectors, have introduced several methodological changes. In the top 20 or 30 these might not mean very much, but lower down the effect could be very large.
Second, rankers sometimes make mistakes and so do those who collect data for institutions.
Third, many new universities have taken part this year. I counted thirteen just in the top 200 and there are certainly many more in the 200s and 300s. A university ranked 200th last year would lose thirteen places even if it had exactly the same relative score.
The thirteen newcomers are Texas at Austin, Rochester, the Hebrew University of Jerusalem, the University of Florida, Brandeis, the Chinese University of Hong Kong, Nijmegen, the Medical University of South Carolina, Louvain, Universite Paris Diderot VII, Queen's University (Canada), Sao Paulo and Western Australia.
Highlights of the THE rankings
Some interesting results.
57. Ohio State University
103. Cape Town
107. Royal Holloway
149. Birkbeck
184. Iowa State
197. Georgia Health Sciences University
201-225. Bilkent
201-225. University of Medicine and Dentistry of New Jersey
226-250. Creighton University, USA
226-250. Tokyo Metropolitan
251-275. Wayne State
276-300. University of Crete
276-300. University of Iceland
276-300. Istanbul Technical University
276-300. Queensland University of Technology
276-300. Tokyo Medical and Dental University
301-350. Alexandria
301-350. Aveiro University
301-350. Hertfordshire
301-350. Plymouth University, UK
301-350. Sharif University of Technology
301-350. National University of Ireland, Maynooth
301-350. Taiwan Ocean University
301-350. Old Dominion University, USA
Wednesday, October 05, 2011
THE Rankings: Caltech Ousts Harvard
This is from the Peninsula in Qatar:
LONDON: US and British institutions once again dominate an annual worldwide league table of universities published yesterday, but there is a fresh name at the top, unseating long-time leader Harvard.
California Institute of Technology (Caltech) knocked the famous Massachusetts institution from the summit of the Times Higher Education (THE) league table for the first time in eight years, with US schools claiming 75 of the top 200 places.
Next is Britain, which boasts 32 establishments in the top 200, but an overhaul in the way in which the country’s universities are funded has raised concerns over its continuing success.
Asia’s increasing presence in the annual table has stalled, with 30th placed University of Tokyo leading the continent’s representation.
China’s top two universities hold on to their elite status, but no more institutions from the developing powerhouse managed to break into the top 200.
THE attributed Caltech’s success to “consistent results across the indicators and a steep rise in research funding”.