
Friday, December 29, 2017

More on Japan and the Rankings

The Japan Times recently published an article by Takamitsu Sawa, President and Distinguished Professor at Shiga University, discussing the apparent decline of Japan's universities in the global rankings.

He notes that in 2014 there were five Japanese universities in the top 200 of the Times Higher Education (THE) world rankings but only two in 2016. He attributes Japan's poor performance to the bias of the citations indicator towards English language publications and the inability or reluctance of Japanese academics to write in English. Professor Sawa seems to be under the impression that THE does not count research papers not written in English, which is incorrect. It is, however, true that the failure of Japanese scholars to write in English prevents their universities doing better in the rankings. He also blames lack of funding from the government and the Euro-American bias of the THE reputation survey.

The most noticeable thing about this article is that the author talks about exactly one table, the THE World University Rankings. This is unfortunately very common, especially among Asian academics. There are now over a dozen global rankings of varying quality, and some of them tell a different, and perhaps more accurate, story than THE's. For example, there are several well-known international rankings in which there are more Japanese universities in the world top 200 than there are in THE's.

There are currently two in the THE top 200 but seven in the Shanghai Academic Ranking of World Universities (ARWU), ten in the QS World University Rankings, ten in the Russian Round University Rankings, seven in the CWTS Leiden Ranking total publications indicator and ten in the Nature Index.

Let's now take a look at the University of Tokyo (Todai), the country's best known university, and its position in these rankings. Currently it is 46th in the world in THE but in ARWU it is 23rd, in QS 28th, in the Leiden Ranking tenth for publications and tenth in the Nature Index. RUR puts the university in 43rd place, still a little better than THE. It is very odd that Professor Sawa should focus on the ranking that puts Japanese universities in the worst possible light and ignore the others.

As noted in an earlier post, Tokyo's tumble in the THE rankings came suddenly in 2015 when THE made some drastic changes in its methodology, including switching to Scopus as data supplier, excluding papers with large numbers of authors such as those derived from the CERN projects, and applying the country adjustment to only half of the citations indicator instead of all of it. Then in 2016 THE made further changes for its Asian rankings that further lowered the scores of Japanese universities.

It is true that the scores of leading Japanese universities in most rankings have drifted downwards over the last few years, but this is a relative trend caused mainly by the rise of a few Chinese and Korean universities. Japan's weakest point, as indicated by the RUR and THE rankings, is internationalisation. These rankings show that the major Japanese universities still have strong reputations for postgraduate teaching and research, while the Nature Index and the Leiden Ranking point to an excellent performance in research in the natural sciences at the highest levels.

Nobody should rely on a single ranking and changes caused mainly by methodological tweaking should be taken with a large bucket of salt.

Monday, May 22, 2017

Arab University Rankings: Another Snapshot from Times Higher Education

Times Higher Education (THE) has produced a "snapshot" ranking of Arab universities extracted from its World University Rankings. There has been no change in the indicators or their weighting. Only 28 universities are included which raises questions about how suitable THE's methodology is for regions like the Middle East and North Africa.

This is an improvement over a remarkable MENA snapshot that THE did in 2015, which put Texas A&M University at Qatar in first place by virtue of one half-time faculty member who was listed as a contributor to a multi-author, multi-cited CERN paper.

The top five universities this time are King Abdulaziz University (KAU), Saudi Arabia, King Fahd University of Petroleum and Minerals, Saudi Arabia, King Saud University, Saudi Arabia, Khalifa University of Science, Technology and Research, UAE, and Qatar University.

The top three places are held by Saudi institutions. So how did they do it? According to an article by a THE editor for the World Economic Forum it was all due to money and internationalisation. 

Up to a point that is correct. The sad story of Trinity College Dublin's botched data submission gives a rough idea of how much reported income affects a university's overall rank: roughly 5 million Euro in reported total income (with proportionate increases for research income and income from industry) and a middling university can jump up a spot in the overall rankings.

But does that explain KAU in top place? It did get a high score, 92.1, for international orientation, but five other Arab universities did better. For teaching it was third, for industry income third, and for research seventh. What actually made the difference was citations: KAU had a score of 93.3, far ahead of the next contender, Jordan University of Science and Technology with 50.2.

KAU's research impact is, according to THE, second in Asia only to the shooting star of India, Vel Tech University, whose single self-citing prodigy supposedly had a greater impact than the whole of any other Asian university. KAU's citations score was the result of the massive recruitment of adjunct faculty, 40 at the last count, who list KAU as a second affiliation. How much time they put in at KAU is uncertain, but the Shanghai rankings calculated that highly cited researchers spent an average of 16% of their time at the university of their second affiliation.

It is bad enough that THE put so much emphasis on income and internationalisation in their methodology, promoting the diversion of resources from things like primary and pre-school education, adult literacy and alternatives to oil exports. To encourage universities to rise in the rankings by hiring adjunct faculty whose contribution is uncertain is very irresponsible. It would be a good thing if this snapshot were ignored.

Wednesday, April 12, 2017

Exactly how much is five million Euro worth when converted into ranking places?

One very useful piece of information to emerge from the Trinity College Dublin (TCD) rankings fiasco is the likely effect on the rankings of injecting money into universities.

When TCD reported to Times Higher Education (THE) that it had almost no income at all, 355 Euro in total, of which 111 Euro was research income and 5 Euro from industry, it was ranked in the 201-250 band in the world university rankings. Let's be generous and say that it was 201st. But when the correct numbers were inserted, 355 million in total (of which 111 million was research income and 5 million industry income), it was in joint 131st place.

So we can say crudely that increasing (or rather reporting) overall institutional income by 5 million Euro (keeping the proportions for research income and industry income constant) translates into one place in the overall world rankings.
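The crude arithmetic behind that estimate can be laid out explicitly. This is just a sketch of the linear extrapolation described above, using the figures from the TCD story; the 201st-place starting point is the generous assumption already made in the text.

```python
# Back-of-the-envelope estimate: Euro of reported income per THE ranking place,
# based on the Trinity College Dublin data-submission error.

misreported_income = 355         # Euro, the erroneous submission
corrected_income = 355_000_000   # Euro, the true total income
rank_with_error = 201            # generously, the top of the 201-250 band
rank_corrected = 131             # joint 131st once the data was fixed

places_gained = rank_with_error - rank_corrected      # 70 places
extra_income = corrected_income - misreported_income  # ~355 million Euro

euro_per_place = extra_income / places_gained
print(f"Roughly {euro_per_place / 1_000_000:.1f} million Euro per place")
```

Of course this treats the relationship as linear across the whole table, which, as the next paragraph notes, it certainly is not.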

Obviously this is not going to apply as we go up the rankings. I suspect that Caltech will need a lot more than an extra 5 million Euro or 5 million anything to oust Oxford from the top of the charts.

Anyway, there it is. Five million Euro and the national flagship advances one spot up the THE world rankings. It sounds like a lot, but when the minister for arts, sports and tourism spends 120 Euro on hat rental, and thousands on cars and hotels, there are perhaps worse things the Irish government could do with the taxpayers' money.

Thursday, April 06, 2017

Trinity College Shoots Itself in the Other Foot

The story so far. Trinity College Dublin (TCD) has been flourishing over the last decade according to the Shanghai and Round University Rankings (RUR) world rankings which have a stable methodology.  The university leadership has, however, been complaining about its decline in the Times Higher Education (THE) and QS rankings, which is attributed to the philistine refusal of the government to give TCD the money that it wants.

It turns out that the decline in the THE rankings was due to a laughable error. TCD had submitted incorrect data to THE: 355 Euro for total income, 111 Euro for research income and 5 Euro for income from industry, instead of 355 million, 111 million and 5 million. Supposedly, this was the result of an "innocent mistake."

Today, the Round University Rankings released their 2017 league table. These rankings are derived from the Global Institutional Profiles Project (GIPP), run by Thomson Reuters and now by Clarivate Analytics, and used until 2014 by THE. TCD has fallen from 102nd place to 647th, well below Maynooth and the Dublin Institute of Technology. The decline was catastrophic for the indicators based on institutional data and very slight for those derived from surveys and bibliometric information.

What happened? It was not the tight fists of the government. TCD apparently just submitted the data form to GIPP without providing data. 

No doubt another innocent mistake. It will be interesting to see what the group of experts in charge of rankings at TCD has to say about this.

By the way, University College Dublin continues to do reasonably well in these rankings, slipping only a little from 195th to 218th.

Tuesday, April 04, 2017

The Trinity Affair Gets Worse

Trinity College Dublin (TCD) has been doing extremely well over the last few years, especially in research. It has risen in the Shanghai ARWU rankings from the 201-300 band to the 151-200 band and from 174th to 102nd in the RUR rankings.

You would have thought that would be enough for any aspiring university and that they would be flying banners all over the place. But TCD has been too busy lamenting its fall in the Times Higher Education (THE) and QS world rankings, which it attributed to the reluctance of the government to give it as much money as it wanted. Inevitably, a high-powered Rankings Steering Group headed by the Provost was formed to turn TCD around.

In September last year the Irish Times reported that the reason, or part of the reason, for the fall in the THE world rankings was that incorrect data had been supplied. The newspaper said that:

"The error is understood to have been spotted when the college – which ranked in 160th place last year – fell even further in this year’s rankings.
The data error – which sources insist was an innocent mistake – is likely to have adversely affected its ranking position both this year and last. "
I am wondering why "sources" were so keen to insist that it was an innocent mistake. Has someone been hinting that it might have been deliberate?

It now seems that the mistake was not just a misplaced decimal point. It was a decimal point moved six places to the left so that TCD reported a total income of 355 Euro, a research income of 111 Euro and 5 Euro income from industry instead of 355, 111, and 5 million respectively. I wonder what will happen to applications to the business school.

What is even more disturbing, although perhaps not entirely surprising, is that THE's game-changing auditors did not notice.

Tuesday, July 05, 2016

Has someone been upgrading the simulation?

An article in the Independent by Matthew Norman suggests that we have slipped into a parallel universe, propelled through hyper-space into another reality. The evidence is Jeremy Corbyn as Leader of the Opposition, Leicester City top of the English Premier League, Brexit, Novak Djokovic losing at Wimbledon and so on.

My suspicion is that we haven't landed in a parallel universe. It is more likely that we are living in a computer simulation which from time to time needs to be updated, leading to temporary anomalies like neutrinos going backwards in time or Wales advancing to the Euro 2016 semi-finals.

Perhaps we will wake up tomorrow and find that something even more improbable has occurred, the inauguration of President Trump, Pete Best joining Paul McCartney for a reunion tour or Boris Johnson going into a monastery.

Or Google Inc. as the number five research institution in the world, up from 195th two years ago, ahead of Yale and Princeton. That might actually give us a clue as to who is running the simulation.

Saturday, August 04, 2012

QS Stars

University World News (UWN) has published an article by David Jobbins about QS Stars, which are awarded to universities that pay (most of them, anyway) for an audit and a three-year licence to use the stars, and which are shown alongside the listings in the QS World University Rankings. Participation is not spread evenly around the world, and according to a QS brochure it is mainly mediocre universities, or worse, that have signed up. Nearly half of the universities that have opted for the stars are from Indonesia.

Jobbins refers to a report in Private Eye which in turn refers to the Irish Examiner. He writes:

The stars appear seamlessly alongside the listing for each university on the World University Rankings, despite protestations from QS that the two are totally separate operations.

The UK magazine Private Eye reported in its current issue that two Irish universities – the University of Limerick and University College Cork, UCC – had paid “tens of thousands” of euro for their stars.

The magazine recorded that UCC had told the Irish Examiner that the €22,000 (US$26,600) cost of obtaining the stars was worthwhile, as it could be recouped through additional international student recruitment.

The total cost for the audit and a three-year licence is US$30,400, according to the scheme prospectus.

The Irish Examiner article by Neil Murray is quite revealing about the motivation for signing up for an audit:

UCC paid almost €22,000 for its evaluation, which includes a €7,035 audit fee and three annual licence fees of €4,893. It was awarded five-star status, which it can use for marketing purposes for the next three years.

The audit involved a visit to the college by QS researchers but is mostly based on analysis of data provided by UCC on eight criteria. The university’s five-star rating is largely down to top marks for research, infrastructure, internationalisation, innovation, and life science, but it got just three stars for teaching and engagement.
About 3,000 international students from more than 100 countries earn UCC approximately €19 million a year.

UCC vice-president for external affairs Trevor Holmes said there are plans to raise the proportion of international students from 13% — one of the highest of any Irish college — to 20%.

"Should UCC’s participation in QS Stars result in attracting a single additional, full-time international student to study at UCC then the costs of participation are covered," he said.

"In recent times, unlike many other Irish universities, UCC has not been in a position to spend significant sums on marketing and advertising domestically or internationally. QS Stars represents a very cost-effective approach of increasing our profile in international media and online."
So now we know how much a single international student adds to the revenue of an Irish university.
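Working the figure out takes only a couple of lines. This is a rough sketch using the Examiner's numbers; it assumes all international students pay roughly the same and ignores differences between programmes.

```python
# Per-student revenue implied by the Irish Examiner's figures for UCC.

international_students = 3_000
annual_revenue = 19_000_000   # Euro per year from international students
qs_stars_cost = 22_000        # Euro: audit fee plus three annual licence fees

revenue_per_student = annual_revenue / international_students
print(f"~{revenue_per_student:,.0f} Euro per international student per year")

# How long would one extra student need to stay to cover the QS Stars bill?
years_to_cover = qs_stars_cost / revenue_per_student
print(f"~{years_to_cover:.1f} years")
```

On these numbers, a single additional student covers the cost only after about three and a half years, roughly the length of a degree, so Mr Holmes's claim is just about defensible.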

So far, there is nothing really new here. The QS Stars system has been well publicised and it probably was a factor in Times Higher Education dropping QS as its data collecting partner and replacing them with Thomson Reuters.

What is interesting about the UWN article is that a number of British and American universities have been given the stars without paying anything. These include Oxford and Cambridge and 12 leading American institutions that are described by QS as "independently audited based on publicly available information". It would be interesting to know whether the universities gave permission to QS to award them stars in the rankings. Also, why are there differences between the latest rankings and the QS brochure? Oxford does not have any stars in last year's rankings but is on the list in the brochure. Boston University has stars but is not on the list. It may be just a matter of updating.

It would probably be a good idea for QS to remove the stars from the rankings and keep them in the university profiles.

Wednesday, April 18, 2012

The Euro-Rankings

The U-Multirank project has slowly crept past the feasibility study stage. While any attempt to undermine the triopoly of the big three global rankers, ARWU (Shanghai), QS and Times Higher Education, is welcome, it does seem to be taking a long time. One wonders whether it will get going before the Eurozone disintegrates. Still, the project does have its defenders.

Jordi Curell, director of lifelong learning, higher education and international affairs at the directorate general for education and culture, conceded that there was opposition to its development.

“When we started working on the project of U-Multirank, many people from the higher education community were opposed to it,” he told an international symposium on university rankings and quality assurance in Brussels on 12 April.

But the system had intrinsic value, he said, because it would provide an evidence-based measure of the performance of European universities that would help them improve.

According to Curell, if higher education is to help Europe emerge from its current financial and economic crisis, the EU needs to know how its universities are performing and universities need to know how they are doing.

“Rankings which are carefully thought out are the only transparency tools which can give a comparative picture of higher education institutions at a national, European and global level,” he told the symposium.

There are critics of course. One of them is a committee of the British House of Lords, which has argued that U-Multirank is a waste of money. Four million Euros sounds like rather a lot, but considering what the EU has spent money on, it is trivial.

And as for wasting the taxpayer's money, well, the committee could look at the other house and think about a floating duck island, pantyliners, nappies, soundproofed bedrooms and so on and so on.

Tuesday, June 16, 2009

An Alternative Global Ranking


Finally the decision on who has won the European Commission’s million euro tender – to develop and test a global ranking of universities – has been announced.

The successful bidder, the CHERPA network (the Consortium for Higher Education and Research Performance Assessment), is charged with developing a ranking system to overcome what the European Commission regards as the limitations of the Shanghai Jiao Tong and the QS-Times Higher Education schemes. The final product is to be launched in 2011.

CHERPA comprises a consortium of leading European institutions in the field; all have been developing and offering rather different approaches to ranking over the past few years. Its members are:

CHE – Centre for Higher Education Development (Gütersloh, Germany)
Center for Higher Education Policy Studies (CHEPS) at the University of Twente (Netherlands)
Centre for Science and Technology Studies (CWTS) at Leiden University (Netherlands)
Research division INCENTIM at the Catholic University of Leuven (Belgium)
Observatoire des Sciences et des Techniques (OST) in Paris
European Federation of National Engineering Associations (FEANI)
European Foundation for Management Development (EFMD)

Thursday, May 03, 2007

Book Review

This is a draft of a review that may appear shortly in an academic journal.

Guide to the World’s Top Universities, John O’Leary, Nunzio Quacquarelli and Martin Ince. QS Quacquarelli Symonds Ltd.: London. 2006.

The THES (Times Higher Education Supplement)-QS World University Rankings have aroused massive interest throughout the world of higher education, nowhere more so than in East and Southeast Asia. Very few university teachers and administrators in the region can be unaware of the apparent dramatic collapse of quality at Universiti Malaya, which was in fact nothing of the sort. That this resulted from nothing more than an error by THES’s consultants and its belated correction has done little to diminish public fascination.

Now, QS Quacquarelli Symonds, the consultants who compiled the data for the rankings, have published a large 512-page volume. The book, written by John O’Leary and Martin Ince of THES and Nunzio Quacquarelli of QS, comes with impressive endorsements. It is published in association with IELTS, TOEFL and ETS, names that quite a few Asian students and teachers will know, and is distributed by Blackwell Publishing of Oxford. At the top of the front cover, there is a quotation from Tim Rogers, former Head of Student Recruitment and Admissions, London School of Economics: “A must-have book for anyone seeking a quality university education at home and abroad.” Tim Rogers, by the way, has been a consultant for QS.

The Guide to the World’s Top Universities certainly contains a large amount of material. There are thirteen chapters as follows.

1. Welcome to the world’s first top university guide
2. Ranking the world’s universities
3. How to choose a university and course
4. The benefits of studying abroad
5. What career? Benefits of a top degree
6. Tips for applying to university
7. What parents need to know -- guide to study costs and more
8. Financing and scholarships
9. The world’s top 200 universities. This is the ranking that was published last year in the THES.
10. The world’s top universities by subject. This was also published in the THES.
11. The top 100 university profiles. This provides two pages of information about each university.
12. The top ten countries
13. Directory of over 500 top world universities.

Basically, there are two parts. The earlier chapters mostly consist of advice that is generally interesting, well written and sensible. Later, we have data about various characteristics of the universities, often ranking them in order. The latter comprise much of the book. The profiles of the top 100 universities take up 200 pages and the directory of 500 plus universities another 140.

So, is this a must-have book? At £19.99, $35.95 or Euro 28.50 the answer has to be: not really. Maybe it would be a good idea to glance through the earlier advisory chapters, but as a source of information and evaluation it is not worth the money. First of all, there are serious problems with the information presented in the rankings, the profiles and the directory. The book’s credibility is undermined by a succession of errors, indicating an unacceptable degree of carelessness. At 35 dollars or 20 pounds we surely have the right to expect something a little better, especially from the producers of what is supposed to be “the gold standard” of university rankings.

Thus we find that the Technical University of Munich appears twice in the profiles in positions 82 (page 283) and 98 (page 313). The latter should be the University of Munich. In the directory the University of Munich is provided with an address in Dortmund (page 407). The Technical University of Helsinki is listed twice in the directory (pages 388 and 389). A number of Swiss universities are located in Sweden (pages 462 and 463). The authors cannot decide whether there is only one Indian Institute of Technology and one Indian Institute of Management (page 416) or several (pages 231 and 253). New Zealand is spelt ‘New Zeland’ (page 441). The profile for Harvard repeats the same information in the factfile under two different headings (page 119). There is something called the ‘Official University of California, Riverside’ on page 483. Kyungpook National University in Korea has a student-faculty ratio of zero (page 452). Something that is particularly irritating is that the authors or their assistants still cannot get the names of Malaysian universities right. So we find ‘University Putra Malaysia’ on page 435 and ‘University Sains Malaysia’ on page 436. After that famous blunder about Universiti Malaya’s international students and faculty one would expect the authors to be a bit more careful.

Still, we must give some credit. At least the book has at last started to use the right name for China’s best or second best university – Peking University, not Beijing University -- and ‘University of Kebangsaan Malaysia’ in the 2006 rankings in the THES has now been corrected to ‘Universiti Kebangsaan Malaysia’.

The Guide really gets confusing, to put it mildly, when it comes to the number of students and faculty. A perceptive observer will note that the data for the student-faculty ratio in the top 200 rankings reproduced in chapter 9 are completely different from those in the profiles in chapter 11 and the directory in chapter 13.

For example, in the rankings Duke University, in North Carolina, is given a score of 100, indicating the best student-faculty ratio. Going to QS’s topuniversities website we find that Duke supposedly has 11,106 students and 3,192 faculty, representing a ratio of 3.48 students per faculty member. But then we turn to the profile and see that Duke is assigned a ratio of 16.7 students per faculty member (page 143). On the same page we are told that Duke has 6,301 undergraduates and 4,805 postgraduates and “just under 1,600 faculty”. That makes a ratio of about 6.94. So, Duke has 3.48 or 6.94 or 16.7 students per faculty member. Not very helpful.

Looking at Yale University, the book tells us on the same page (127) that the student-faculty ratio is 34.3 and that the university has “around 10,000 students” and 3,333 faculty, a ratio of 3 students for each faculty member.

On page 209 we are told that the University of Auckland has a student-faculty ratio of 13.5 and in the adjacent column that it has 2,000 academic staff and 41,209 students, a ratio of 20.6. Meanwhile, the top 200 rankings give it a faculty-student score of 38, which works out at a ratio of 9.2. So, take your pick from 9.2, 13.5 and 20.6.
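The contradictions are easy to check from the book's own figures. Here is a quick sketch recomputing the ratios for Duke, Yale and Auckland from the head counts the Guide itself quotes:

```python
# Recomputing student-faculty ratios from the head counts the Guide quotes.

def ratio(students, faculty):
    return students / faculty

# Duke: QS website counts vs. the profile's own head counts (profile states 16.7)
duke_website = ratio(11_106, 3_192)          # ~3.48
duke_profile = ratio(6_301 + 4_805, 1_600)   # ~6.94

# Yale: the same page states a ratio of 34.3 alongside these head counts
yale_counts = ratio(10_000, 3_333)           # ~3.0

# Auckland: stated ratio 13.5, head counts in the adjacent column
auckland_counts = ratio(41_209, 2_000)       # ~20.6

print(f"Duke: {duke_website:.2f} (website) vs {duke_profile:.2f} (profile) vs 16.7 (stated)")
print(f"Yale: {yale_counts:.1f} (counts) vs 34.3 (stated)")
print(f"Auckland: {auckland_counts:.1f} (counts) vs 13.5 (stated)")
```

No two sources in the book agree, which is the point: the numbers cannot all be right.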

The data for research expertise is also contradictory. Universities in Australia and China get excellent scores for the “peer review” of best research in the rankings of the top 200 universities in chapter 9 but get relatively poor scores for research impact. The less glamorous American universities like Boston and Pittsburgh get comparatively low scores for peer review of research but actually do excellent research.

Errors and contradictions like these seriously diminish the book’s value as a source of information.

It would not be a good idea to buy this book, although it might be worth looking at the early chapters if you can borrow it from a library. To judge the overall global status of a university, the best bet would be to look east and turn to the Shanghai Jiao Tong University Index, available on the Internet, which ranks the top 500 universities. This index focuses entirely on research but there is usually at least a modest relationship between research activity and other variables such as the quality of the undergraduate student intake and teaching performance. Those thinking about going to the US should look at the US News and World Report’s America’s Best Colleges. Anyone concerned about costs – who isn’t? – should look at Kiplinger’s Index, which calculates the value for money of American universities. Incidentally, fifth place here goes to the State University of New York at Binghamton, which is not even mentioned in the Guide. The Times (which is not the same as the Times Higher Education Supplement) and Guardian rankings are good for British universities.

Students who are not certain about going abroad or who are thinking about going to a less well known local institution could try doing a Google Scholar search for evidence of research proficiency and a Yahoo search for miscellaneous activity. Whatever you do, it is not a good idea to rely on any one source alone and certainly not this one.