Sunday, January 15, 2012

Research Fraud in the UK

An article in Times Higher Education by Jack Grove indicates that a great deal of research fraud is going on in UK universities, although the comments raise questions about the validity of the study.

I am wondering if there is any comparative international data available.
Primary School League Tables

The craze for rankings continues unabated. There is now a league table for English primary schools.


The league tables show the percentage of 11-year-olds in each school reaching Level 4 – the standard expected for their age group – in both English and maths at primary school.
Officially, this means they can spell properly, start to use grammatically complex sentences and employ joined up handwriting in English. In maths, they should be able to multiply and divide whole numbers by 10 or 100 and be able to use simple fractions and percentages.

Pupils exceeding this standard are awarded a higher Level 5. Data for individual schools also shows three other measures: average points score, value-added and pupil progress.

Saturday, January 14, 2012

Who says college isn't worth it?

There is a website called SeekingArrangement which puts sugar daddies and sugar babies in touch with one another. Apparently, large numbers of college graduates are signing up in the latter category, perhaps because of increasing difficulties in paying off student loans.

There is now a ranking of the top 20 colleges among sugar baby sign-ups. I have doubts about the validity of the ranking. All that is necessary to become a certified college sugar baby, and to get three times as many enquiries from sugar daddies (it's good to know that American society still values education), is a .edu email address.

New York University might be number one because its tuition fees are so high, or because job prospects for its graduates are so meagre, or maybe because the sugar daddies are in New York.

Here are the top twenty:

1. New York University (NYU) -- 185
2. University of Georgia -- 155
3. University of Phoenix -- 144
4. Tulane University -- 129
5. Temple University -- 113
6. Virginia Community College -- 108
7. University of Southern Florida -- 93
8. Arizona State University -- 85
9. Michigan State University -- 81
10. Ivy Tech Community College -- 78
11. Georgia State University -- 74
12. University of Wisconsin -- 73
13. Penn State University -- 72
14. University of Central Florida -- 67
15. Kent University -- 65
16. Maricopa Community College -- 63
17. Indiana University -- 62
18. University of California, Berkeley -- 61
19. The Art Institutes -- 60
20. Florida International University -- 59

Please note that I found this story via the Chronicle of Higher Education.
Online Education Rankings

US News has announced its rankings of American online education programs. There are six categories: Bachelor's, Business, Education, Engineering, Info Tech and Nursing. Within each category there are rankings for faculty credentials and training; student services and technology; student engagement and assessment; and, except for bachelor's, admissions selectivity. For bachelor's programs the top universities are:

Faculty Credentials and Training
Westfield State University, Massachusetts

Student Engagement and Assessment  
Bellevue University, Nebraska

Student Services and Technology
Arizona State University, Tempe

Wednesday, January 11, 2012

The end of the university as we know it?


MIT has already been putting its course materials online for anyone to access free of charge. Now they are going a step further.

"MIT today announced the launch of an online learning initiative internally called “MITx.” MITx will offer a portfolio of MIT courses through an online interactive learning platform that will:
  • organize and present course material to enable students to learn at their own pace
  • feature interactivity, online laboratories and student-to-student communication
  • allow for the individual assessment of any student’s work and allow students who demonstrate their mastery of subjects to earn a certificate of completion awarded by MITx
  • operate on an open-source, scalable software infrastructure in order to make it continuously improving and readily available to other educational institutions.
MIT expects that this learning platform will enhance the educational experience of its on-campus students, offering them online tools that supplement and enrich their classroom and laboratory experiences. MIT also expects that MITx will eventually host a virtual community of millions of learners around the world."

There are a lot of questions that come to mind. Will students be assessed according to the same standards as conventional MIT students? If someone accumulates sufficient certificates of completion, will they be entitled to an MITx degree? What will happen if employers and graduate schools start accepting MITx certificates as equivalent to standard academic credentials? If so, will MIT be able to resist the temptation to start charging hefty fees for a certificate?

MIT may, perhaps unwittingly,  have started a process that will end with universities becoming something very different.

Monday, January 02, 2012

How Did I Miss This?

The blog Registrarism has discovered a fascinating article, published in 2002 in Higher Education Quarterly, that compares university league tables (that is, British university rankings) with football (soccer to Americans) league tables.

Sunday, December 18, 2011

Leiden Ranking: Many Ways to Rate Research

My article on the Leiden Rankings in University World News can be found here.

 It looks as though a two-tier international university ranking system is emerging.

At the top we have the 'big three', Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.

These receive massive attention from the media, are avidly followed by academics, students and other stakeholders and are often quoted in promotional literature. Graduation from a university included in these has even been proposed as a requirement for immigration.

Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.

These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.

Wednesday, December 07, 2011

Update 6 on El Naschie vs Nature

There have been no reports for several days and the trial is now over. There will be a judgement in January.
What to do about the research bust

Mark Bauerlein has an article in the Chronicle of Higher Education on the disparity between the extraordinary effort and intelligence poured into scholarly writing in the humanities and the meager attention such writing receives.

"I devised a study of literary research in four English departments at public universities—the University of Georgia, the University at Buffalo, the University of Vermont, and the University of Illinois at Urbana-Champaign—collecting data on salaries, books and articles published, and the reception of those works. The findings:
  • Those universities pay regular English faculty, on average, around $25,000 a year to produce research. According to the faculty handbooks, although universities don't like to set explicit proportions, research counts as at least one-third of professors' duties, and we may calculate one-third of their salaries as research pay. This figure does not include sabbaticals, travel funds, and internal grants, not to mention benefits, making the one-third formula a conservative estimate.
  • Professors in those departments respond diligently, producing ample numbers of books and articles in recent years. At Georgia, from 2004 to 2009, current faculty members produced 22 authored books, 15 edited books, and 200 research essays. The award of tenure didn't produce any drop-off in publication, either. Senior professors continue their inquiries, making their departments consistently relevant and industrious research centers.
  • Finally, I calculated the impact of those publications by using Google Scholar and my own review of books published in specific areas to count citations. Here the impressive investment and productivity appear in sobering context. Of 13 research articles published by current SUNY-Buffalo professors in 2004, 11 of them received zero to two citations, one had five, one 12. Of 23 articles by Georgia professors in 2004, 16 received zero to two citations, four of them three to six, one eight, one 11, and one 16. "
Bauerlein suggests that these limited citation counts are telling us something: that talented scholars might find better things to do and that society might direct resources elsewhere.

The QS World University Rankings would apparently agree. Their citations indicator simply counts the total number of citations and divides it by the total number of faculty. This is a very crude measure, especially since it counts the current number of faculty but then counts the citations to articles written over a five-year period. Any university seeking a boost in the QS rankings could simply axe a few English, history and philosophy specialists and replace them with oncologists and engineers. True, the world would lose studies about Emily Dickinson's Reluctant Ecology of Place, cited once according to Google Scholar, or Negotiations of Homoerotic Tradition in Paradise Regained, but if this were accompanied by even a small advance in cancer treatment, who would really care? There would be an even better effect on the Shanghai rankings, which do not count publications or citations in the humanities but still include the faculty in their productivity indicator.
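
To see just how crude the measure is, here is a minimal sketch in Python of the citations-per-faculty arithmetic described above. All the figures are invented for illustration, and QS's actual data collection and scaling are of course more involved than this.

```python
# A toy model of a citations-per-faculty indicator. All figures hypothetical.

def citations_per_faculty(total_citations: int, current_faculty: int) -> float:
    """Citations to papers from a five-year window divided by
    today's faculty headcount."""
    return total_citations / current_faculty

# A hypothetical university: 2,000 faculty, 50,000 citations.
print(citations_per_faculty(50_000, 2_000))  # 25.0

# Swap 100 rarely cited humanities posts for 100 oncologists and engineers
# who bring, say, 40,000 extra citations between them: the headcount is
# unchanged but the score jumps.
print(citations_per_faculty(90_000, 2_000))  # 45.0
```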

But there are those who would argue that, while disciplines have different citation practices, they must be regarded as being on the same level in all other respects. Thomson Reuters, who collect the data for the Times Higher Education rankings, now normalise their data so that citations in a specific discipline in a specific country in a specific year are benchmarked against the average for that discipline in that country in that year. That would mean that the article by the Buffalo professors with five citations might look quite good.
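
A rough sketch of what such benchmarking amounts to, again with invented benchmark figures (Thomson Reuters' actual reference values are not public):

```python
# Field-, country- and year-normalised citation impact: raw citations
# divided by the average for the same discipline, country and year.
# Benchmark values below are invented.

def normalised_impact(citations: int, benchmark: float) -> float:
    return citations / benchmark

# If literary-studies articles published in the US in 2004 averaged,
# say, 2.5 citations, the Buffalo article with five citations scores:
print(normalised_impact(5, 2.5))   # 2.0 -- twice the field average

# The same five citations against a physics-style benchmark of 25.0:
print(normalised_impact(5, 25.0))  # 0.2 -- far below average
```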

I have a suggestion for those professors of English and other disciplines that hardly anyone seems to read any more. Go to some Central Asian or East African republic where the papers in your field get only a few citations: the next article you write, with its miserable handful of citations, will be well above average for that country, and your new university will suddenly perform well in the Times Higher rankings. Just make sure that your employer produces two hundred papers a year altogether.

Friday, December 02, 2011

European Universities and Rankings

Research Trends, the newsletter from Scopus, reports on a conference of European universities that discussed international rankings. The participants found positive aspects to rankings but also had criticisms:

Going through the comparison of the various methodologies, the report details what is actually measured, how the scores for indicators are measured, and how the final scores are calculated — and therefore what the results actually mean.
The first criticism of university rankings is that they tend to principally measure research activities and not teaching. Moreover, the ‘unintended consequences’ of the rankings are clear, with more and more institutions tending to modify their strategy in order to improve their position in the rankings instead of focusing on their main missions.
For some ranking systems, lack of transparency is a major concern, and the QS World University Ranking in particular was criticized for not being sufficiently transparent.
The report also reveals the subjectivity in the proxies chosen and in the weight attached to each, which leads to composite scores that reflect the ranking provider’s concept of quality (for example, it may be decided that a given indicator may count for 25% or 50% of overall assessment score, yet this choice reflects a subjective assessment of what is important for a high-quality institute). In addition, indicator scores are not absolute but relative measures, which can complicate comparisons of indicator scores. For example, if the indicator is number of students per faculty, what does a score of, say, 23 mean? That there are 23 students per faculty member? Or does it mean that this institute has 23% of the students per faculty compared with institutes with the highest number of students/faculty? Moreover, considering simple counts or relative values is not neutral. As an example, the Academic Ranking of World Universities ranking does not take into consideration the size of the institutions.
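
The ambiguity about relative scores that the report raises can be made concrete with a toy example (numbers invented): the same published score of 23 supports two quite different readings.

```python
# Two readings of a published students-per-faculty "score" of 23.
# The maximum ratio below is hypothetical.

published_score = 23
max_ratio = 40.0  # highest students-per-faculty ratio among ranked institutions

# Reading 1: the score is the raw ratio itself.
reading_1 = published_score                    # 23 students per faculty member

# Reading 2: the score is a percentage of the highest ratio,
# i.e. score = 100 * ratio / max_ratio, so the underlying ratio is:
reading_2 = published_score / 100 * max_ratio  # 9.2 students per faculty member

print(reading_1, reading_2)
```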

I am not sure these criticisms are entirely fair. It seems that the weighting of the various indicators in the Times Higher Education rankings emerged from a great deal of to-ing and fro-ing between various stakeholders and advisers. In the end, far too much weight was given to citations, but that is not quite the same as assigning arbitrary or subjective values.

The Shanghai rankings do have an indicator, productivity per capita, that takes faculty size into account, although it is only ten per cent of the total ranking. The problem here is that faculty in the humanities are counted but their publications are not.

I am not sure why QS is being singled out with regard to transparency. The THE rankings are also, perhaps in a different way, quite opaque. Aggregate scores are given for teaching environment, research and international orientation without indicating the scores that make up these criteria.

So what is to be done?


The EUA report makes several recommendations for ranking-makers, including the need to mention what the ranking is for, and for whom it is intended. Among the suggestions to improve the rankings, the following received the greatest attention from the audience:
  1. Include non-journal publications properly, including books, which are especially important for social sciences and the arts and humanities;
  2. Address language issues (is an abstract available in English, as local language versions are often less visible?);
  3. Include more universities: currently the rankings assess only 1–3% of the 17,000 existing universities worldwide;
  4. Take into consideration the teaching mission with relevant indicators.

The first of these may become feasible now that Thomson Reuters has a book citation index. The second and third are uncontroversial. The fourth is very problematical in many ways.

The missing indicator here is student quality. To be very blunt, universities can educate and instruct students but they can do very little to make them brighter.  A big contribution to any university ranking would be a comparison of the relative cognitive ability of its students. That, however, is a goal that requires passing through many minefields.

Thursday, December 01, 2011

Diversity and Rankings

Robert Morse, director of data research at US News and World Report, discusses the question of whether "diversity" should be included in the ranking of American law schools.

"I was one of speakers on the "Closing Plenary: Reforming U.S. News Rankings to Include Diversity" panel, which discussed many of the issues pertaining to whether U.S. News should add a measure of diversity directly into the Best Law Schools rankings. I pointed out that U.S. News believes diversity is important and that is why we all ready publish a separate law school diversity index.

Our current index identifies law schools where law students are most and least likely to encounter classmates from a different racial or ethnic group. However, the current index does not measure how successful each law school is at meeting a diversity goal or benchmark at the school, state, local, or national level. It also gives schools enrolling one ethnic group a low score, though that school's enrolment may match its state's ethnic population or the school may be a Historically Black College or University. It's for these reasons the current index would not be appropriate to add into the rankings".

Diversity here does not mean diversity of ideology, religion, class, politics or nationality. It simply means the representation of recognised minorities, mainly African-Americans, Hispanics and Asian-Americans.

It is interesting to look at the diversity index and to see the likely effect of including diversity in the law school rankings. The most diverse law school is the University of Hawaii. The University of the District of Columbia and Florida International University also get high scores. Low scorers include Harvard, Yale and UCLA.

Somehow, I do not think that an indicator that benefited Florida International University at the expense of Harvard would add to the credibility of these rankings.

Unless it can be demonstrated that there is something magically transforming about the statistical profile of a law school reflecting that of its city, state, or nation or future nation, this proposal does not sound like a very good idea.
The Utility of Rankings

Another advantage of a good performance in the international university rankings is that graduates will be able to get into Russian postgraduate programs, provided their university is in a G8 country.


Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.

The list will include only universities located within the G8 countries that enter the top 300 in the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.

The reform is intended to attract more students to take part in Russian MA and PhD programs.

Saturday, November 26, 2011

Update 5 on El Naschie vs Nature

The BBC has another piece on the Nature case by Pallab Ghosh. It seems that El Naschie now admits that his papers were not peer reviewed but argues that this was because he had no peers who could review them:


He said that he would discuss his papers with fellow scientists, and only when he thought that they were of a sufficiently high standard would he publish them. "I am too arrogant and have too much self-respect to allow a bad paper to pass through," he said.
Prof El Naschie called one witness, Prof Otto Rossler - an honorary editor of Chaos, Solitons and Fractals.
He told the court that there was no-one who could peer review him, referring to Prof El Naschie, because "if you have something new to offer, peer review is dangerous", adding that in such cases "peer review delays progress in science".
Prof El-Naschie asked his witness whether he thought that his (Prof El Naschie) papers were of "poor quality".
Prof Rossler replied: "On the contrary, they were very important and will become more important in the future."
And he added: "You are the most hard-working and diligent scientist I have ever met."

It is useful to compare Rossler with Neil Turok, Nature's expert witness. Rossler is best known lately for warning that the Large Hadron Collider risked creating black holes that would destroy the world. See El Naschie Watch for this and other information. He is also a self-proclaimed simultaneous submitter, something that for journal editors is almost as bad as plagiarism. A comment on Rossler's claims concludes: "To conclude: this text would not pass the referee process in a serious journal".

It seems that it is increasingly difficult to argue that Alexandria University's remarkable scores for research impact in the Times Higher Education World University Rankings were the result of outstanding, excellent or even controversial research.

I am not sure what an honorary editor is, but at the moment Rossler is not listed as any sort of editor on the official Chaos, Solitons and Fractals home page.


Incidentally, since the trial will presumably turn to the question of El Naschie's affiliations at some point, this page lists him as Founding Editor but does not give any affiliation.
The Politics of Ranking

One of the more interesting aspects of the university ranking business is the way it is used by local politicians to advance their agendas. This is especially obvious in Malaysia, where errors and methodological changes have sent local universities bouncing up and down the QS rankings. Every rise is proclaimed to be a vindication of government policy while every fall is accompanied by head-shaking from the opposition.

This year Universiti Malaya moved into the QS top 200. There is nothing surprising about that: it has been there before. More significant was getting into the top 500 of the Shanghai Academic Ranking of World Universities. That is a lot harder but also more likely to reflect real underlying changes. It seems that UM has finally realised that a little bit of encouragement and financial support can produce quite significant results in a short period of time.

Patrick Lee, in the blog of opposition leader Lim Kit Siang, comments:


PETALING JAYA: Malaysia has little to show for its universities despite spending more money on tertiary education than do many other countries.
Malaysian universities lag behind many counterparts in Asia, including those located in neighbouring countries like Thailand and Singapore, according to a World Bank report released today.
“While Malaysia spends slightly more than most countries on its university students, leading Malaysian universities perform relatively poorly in global rankings,” said the report, entitled Malaysia Economic Monitor: Smart Cities.
Citing the Quacquarelli Symonds (QS) World University Rankings 2010, it noted that Universiti Malaya (UM) was ranked 207th worldwide and 29th in Asia.
It also quoted a US News and World 2011 report on the World’s Best Universities, which put UM, Universiti Kebangsaan Malaysia, Universiti Sains Malaysia and Universiti Putra Malaysia at 167th, 279th, 335th and 358th place respectively.
Even more worrying, the World Bank report observed, was the “increasing gap” between Malaysia’s and Singapore’s universities.
It compared UM with the National University of Singapore (NUS), which QS cited as the leading university in Southeast Asia.
“The gap between UM and NUS has been high and generally increasing, especially in the sciences,” the report said.
According to the report, UM and NUS were on par when it came to science and technology in 2005. However, UM has lost out to NUS over the past six years.
The report also said many of Malaysia’s university graduates did not seem to have the skills that would help them get employment.

Firstly, the QS  and the US News and World Report rankings are the same. Secondly, it is a lot easier to start a university in Malaysia than in Singapore.

Even so, moving into the Shanghai rankings is a real advance and should be recognised as such. 
Update 4 on El Naschie vs Nature

Gervase de Wilde has an article on the case on Inforrm's Blog. He refers to the Guardian's account and then adds the following comment:


Comment
The case seems to offer ammunition to libel reformers. Even in the absence of the ill-advised and incoherent aspects of his case which were excluded before trial, and of the implicit comparisons of his work to Einstein’s made during the first five days at the High Court, his claim against a venerable and highly respected scientific journal seems a poor substitute for meeting their allegations head on in some form of correspondence or public debate. Moreover, the journal had published the Claimant’s own defence of his methods in running CSF, that he sought to emphasise scientific content above impressive affiliations, in the original article.

A spokesperson for the Libel Reform campaign, speaking to the Guardian, commented that reform can’t come soon enough, since
“Scientists expect publications like Nature to investigate and write about controversies within the scientific community. The threat of libel action is preventing scientific journals from discussing what is good and bad science.”

However, the public interest defence argued for by campaigners is one which is already being employed. The BBC reports that Andrew Caldecott QC’s opening statement for the Defendants described their defence as relying on the article being “true, honest opinion and responsible journalism on an issue of public interest”.

As the choice of witnesses indicates, the case does touch on the seemingly incomprehensible branch of physics in which the Claimant has made his academic career. In this respect there is a threat of a libel action stifling academic debate, and a similarity to BCA v Singh 2010 EWCA Civ 350, where opinions expressed in a controversy on what was essentially a scientific matter were at issue. But it is also about the methods he employed in running a publication in the context of a widely recognised system of accreditation and review, and about allegations regarding the professional affiliations which feature on his website. These are the kind of criticisms that might be made about any professional person, and would not necessarily come under scope of a scientific exception for “rigorous debate” on good and bad science urged by campaigners.

It is questionable whether the phrase "academic career" is appropriate, since El Naschie does not appear to have held any formal permanent academic post recently, unless one counts the award of an emeritus professorship by Alexandria University, a strange distinction since there is no sign of the professorship from which he retired.

Friday, November 18, 2011

Update 3 on El Naschie vs Nature

The Guardian has a substantial report on the case by Alok Jha. It seems that El Naschie believes that expert witness Neil Turok is unqualified to understand his work. It is difficult to see how this argument, even if valid, is relevant to the question of whether or not peer review took place.

Should the court decide in favour of El Naschie, it would provide some sort of justification for the methods used in the citations indicator of the Times Higher Education rankings, which gave high scores to Alexandria University mainly or partly because of the many citations of papers by El Naschie.

El Naschie is suing Nature as a result of a news article published in 2008, after the scientist's retirement as editor-in-chief of the journal Chaos, Solitons and Fractals. The article alleged that El Naschie had self-published several research papers, some of which did not seem to have been peer reviewed to an expected standard and also said that El Naschie claimed affiliations and honorary professorships with international institutions that could not be confirmed by Nature. El Naschie claims the allegations in the article were false and had damaged his reputation.
On Friday, Nature called Professor Neil Turok, a cosmologist and director of the Perimeter Institute in Canada, as an expert witness to assess some of the work published by El Naschie.
Turok described his expertise as being in cosmology. "I work at the theoretical end of cosmology … my work consists of applying unified theories, such as string theory, to the most difficult questions in cosmology, namely the beginning of the universe or the initial singularity, the moment where everything was at a single point in the conventional description."
In his evidence, Turok said he found it difficult to understand the logic in some of El Naschie's papers. The clear presentation of scientific ideas was an important step in getting an idea accepted, he said. "There are two questions – one is whether the work is clearly presented and readers would be able to understand it. It would be difficult for a trained theoretical physicist to understand [some of El Naschie's papers]. I couldn't understand it and I made a serious attempt to understand it. The second question is about the correctness of the theory and that will be decided by whether it agrees with experiments. Most theories in theoretical physics are speculative – we form a logical set of rules and deductions and we try, ultimately, to test the deductions in experiments. For me, clear presentation is the first thing in the presentation of a theory."
In response, El Naschie pointed out that even Albert Einstein had made mistakes in his publications. "Einstein is the most sloppy scientist ever. He never defined his quantities, he doesn't put in references and he made so many mistakes of mathematics and concepts. He was a very natural man when he explained something to lay people. But Einstein, whom I admire very much because he had imagination and the courage to stand up to the bloody Nazis, Einstein was an extremely sloppy man."
Later in the session, El Naschie accused Turok of having "no idea" about mathematics and being unqualified to assess his work. "If somebody doesn't understand things, it's his own limitation," El Naschie said.

Thursday, November 17, 2011

Times Higher Social Science Rankings


1.  Stanford
2.  Harvard
3.  Oxford
The Influence of Rankings

Varsity, the student newspaper at Cambridge, suggests that British universities are recruiting staff in order to improve their position in the QS rankings:


Matthew Knight, chairman of Universities HR and the University of Leeds HR director, said: “Within the context of £9,000 fees, many universities have a strategic drive to improve the quality of the student experience.
“Therefore, many are taking the opportunity to improve student staff ratios regardless of the numbers of applicants. So there’s a lot of recruitment going on at some universities, although there’s no specific pattern to this.”
As the QS World University Rankings use student-faculty ratios as the only globally comparable indicator to determine their tables, an increase in employment can be used to promote a university’s image and attract students.

Wednesday, November 16, 2011

Update 2 on El Naschie and Nature

Note that New Scientist describes El Naschie as an "independent physicist". Does this imply that he has no affiliation and that Nature was correct in questioning his claims to academic status?
Update on El Naschie and Nature

The New Scientist has provided some coverage of the trial, which is also discussed at El Naschie Watch. On November 15, this item by Chelsea Whyte appeared:


Benjamin De Lacy Costello, a materials scientist at the University of the West of England in Bristol, UK, testified yesterday that when El Naschie was editor, the peer-review process at Chaos, Solitons and Fractals was "frustrating" and unlike that of other journals.

With regard to the dispute over El Naschie's affiliations, Timothy John Pedley, former head of the department of applied mathematics and theoretical physics at the University of Cambridge, said that El Naschie was a visiting scholar with access to libraries and collaborations at the department, but was not an honorary scholar working with the privileges of a professor.

On November 16 this update appeared:
Update: Mohamed El Naschie, a former editor of the scientific journal Chaos, Solitons and Fractals, appeared in London's High Court today for the libel lawsuit he has brought against the scientific journal Nature.

El Naschie is representing himself.
During El Naschie's cross-examination of journalist Quirin Schiermeier, who wrote the 2008 article about him, Schiermeier stood by the content of the work, saying, "We wrote the article because you published 58 papers in one year in a journal where you acted as editor-in-chief. That is unusual and potentially unethical."

El Naschie responded that he felt it wasn't unheard of for journals to publish work that isn't peer-reviewed. He also said that his work had been stolen. "We published my work to secure it," he told the court. "Senior people are above this childish, vain practice of peer review."

I am not an expert, but El Naschie no longer seems to dispute that his pattern of self-publication was unusual or that there had been little or no peer review. He is simply claiming that publication was necessary to preempt the theft of his work by rivals and that the absence of peer review was excused by his seniority. Whether that is inconsistent with Nature's comments is, I assume, a matter for the judge to decide.


El Naschie and Nature

The El Naschie vs Nature case is under way at the Royal Courts of Justice in London.

Briefly, Mohamed El Naschie, the former editor of the journal Chaos, Solitons and Fractals, is suing the journal Nature and the writer Quirin Schiermeier for its comments on the journal's publication of many of his own papers.

El Naschie is claiming that he was defamed by the suggestion that his papers were of poor quality and were published without a normal peer review process. He also claims that he had been defamed by the imputation that he had claimed academic affiliations to which he was not entitled.

The case is of vital importance to academic freedom since if successful it would mean that wealthy persons could stifle even the most balanced and temperate comments on scientific and scholarly activities.

It is also of importance to the question of international university rankings, since El Naschie's unusual self-publication and self-citation within a short period of time, in a field where citations are low, allowed Alexandria University to achieve an extraordinarily high score in the 2010 Times Higher Education World University Rankings. Even this year, the university had an unreasonably high score on the ranking's research impact indicator. If El Naschie were successful in his claim, then Times Higher and Thomson Reuters, who collected and analysed the data for the rankings, would be able to argue that they had uncovered a small pocket of excellence.

The case has been covered extensively in El Naschie Watch and has been discussed in the scientific press.

Updates will be provided from time to time.

Tuesday, November 15, 2011

The THE Subject Rankings

The ranking season has drawn to a close, or at least it will when we have digested the feasibility report from the European Commission's U-Multirank project. Meanwhile, to tie up some loose ends, here are the top three from each of THE's subject group rankings.

Engineering and Technology

1.  Caltech
2.  MIT
3.  Princeton

Arts and Humanities

1.  Stanford
2.  Harvard
3.  Chicago

Clinical, Pre-Clinical and Health

1.  Oxford
2.  Harvard
3.  Imperial College London

Life Sciences

1.  Harvard
2.  MIT
3.  Cambridge

Physical Sciences

1.  Caltech
2.  Princeton
3.  UC Berkeley

Social Sciences

To be posted on the 17th of November.

Monday, November 07, 2011

Conference in Shanghai

I hope to post something in a day or two on the recent World Class Universities conference in Shanghai. Meanwhile, there is an interesting comment by Alex Usher of Higher Education Strategy Associates, a Canadian consulting firm.

"In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of three main competing world rankings in education, the others being the Times Higher Education Supplement (THES) and the QS World Rankings.

The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. If you look at the correlation between the indicators used in each of the rankings, THES and QS are very weak (meaning that the final results are highly sensitive to the weightings), while the Shanghai rankings are very strong (meaning their results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with its more catholic inclusion policy (500 institutions ranked compared to the THES’s 200), it allows more institutions to compare themselves to the best in the world – at least as far as research is concerned. "

Some technical points. First, Times Higher Education Supplement changed its name to Times Higher Education when it converted to a magazine format in 2008.

Second, the Shanghai rankings are not entirely free from commercial pressures themselves although that has probably had the laudable effect of maintaining a stable methodology since 2003.

Third, both THE and QS accept data from institutions but both claim to have procedures to validate them. Also, the Shanghai rankings do include data from government agencies in their productivity per capita criterion and in some places that might not be any more valid than data from universities.

Fourth, until recently there has been a significant difference in the expert opinion used by THE and by QS. Most of QS's survey respondents were drawn from the mailing lists of the Singapore- and London-based academic publisher World Scientific, while THE's are drawn from those who have published papers in the ISI indexes. All other things being equal, we would expect THE's respondents to be more expert. This year the difference has been reduced somewhat, as QS are getting most of their experts from the Mardev lists, supplemented by a sign-up facility.

Fifth, although THE publish a list of 200 universities in print and on their site, there is a fairly easily downloadable iPhone app available that lists 400 universities.

The most important point, though, is the question of consistency. It is quite true that the various indicators in the Shanghai rankings correlate quite closely or very closely with one another (.46 to .90 in 2011, according to a conference paper by Ying Chen and Yan Wu of the Shanghai Center for World-Class Universities) while some of those in the QS and THE rankings have little or no relation to one another. However, it could be argued that if two indicators show a high correlation with one another then they are to some extent measuring the same thing and one of them is redundant. Still, that is probably better than indicators which statistically have little to do with one another.
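
For readers curious about what such a check involves, here is a brief sketch of computing pairwise correlations between indicator scores. The data are invented; the .46 to .90 figures quoted above come from Chen and Wu's paper, not from this example.

```python
# Pairwise Pearson correlations between ranking indicators.
# Rows are universities, columns are indicator scores (invented data).
import numpy as np

scores = np.array([
    [90, 85, 95],
    [70, 75, 72],
    [50, 40, 55],
    [30, 35, 28],
], dtype=float)

corr = np.corrcoef(scores, rowvar=False)  # 3 x 3 correlation matrix
print(corr.round(2))

# High off-diagonal values mean the indicators move together: overall
# ranks are robust to reweighting, but the indicators may be partly
# redundant. Low values mean the composite score is highly sensitive
# to the weights chosen.
```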

What is more important perhaps is the consistency from one year to another. The main virtue of the Shanghai rankings is that changes in position can be assumed to reflect actual real world changes whereas those in the THE and QS rankings could easily be the result of methodological changes or, in the case of THE, omissions or inclusions.

Friday, October 28, 2011

An Error

This replaces an earlier post.

Last year Times Higher Education admitted to an error involving Monash University and the University of Adelaide:

Also, after the launch of the World University Rankings 2010 it became apparent that, owing to a data processing error, the ranking positions of two Australian universities in the top 200 list were incorrect — the University of Adelaide and Monash University.

Both universities remain in the top 1 per cent of world universities.


This year, a representative of Adelaide commented on the error: 


Adelaide's DVCR Mike Brooks said it had been "disconcerting'' that there had been a data processing error last year in the first iteration of the revised rankings since their split from QS. "It certainly raises further questions about the credibility of the rankings,'' Professor Brooks said.

"Based on our own analysis we believe that we have a similar ranking this year to that of 2010. The shift in position is attributed to the error in the processing last year, ongoing changes in THE methodology and increased competition.''

"I think the students and the wider community are able to judge for themselves.  As South Australia's leading research-university and only member of the Group of Eight, I know that we are in an incredibly strong position for the future.''

Adelaide's fall seems to have been due very largely to a massive fall in its score for research impact. How much of this was due to the correction of the 2010 error, how much to changes in THE methodology and how much to the inherent instability of the normalisation procedure is not clear.

Monday, October 17, 2011

GLOBAL: Despite ranking changes, questions persist 

My article on the Times Higher Education World University Rankings can be accessed at University World News.

The international university ranking scene is starting to look like the heavyweight boxing division. Titles are proliferating and there is no longer an undisputed champion of the world. Caltech has just been crowned top university by Times Higher Education and Thomson Reuters, their data collectors and analysts, while QS have put Cambridge in first place. Over at Webometrics, MIT holds the number one spot. But Harvard has the consolation of remaining the top university in the Scimago and HEEACT rankings as well as the Academic Ranking of World Universities, ARWU, published by Shanghai Jiao Tong University.

Read here