Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, April 08, 2010
An article by Andrew Trounson in The Australian discusses the developing rivalry between the Times Higher Education and QS rankings and its implications for Australian universities.
Both ranking groups are attempting to increase the number of responses to the academic surveys that will feed into their rankings. This could have serious consequences for Australian universities, which have always done better in the survey-based components of the THE-QS rankings than in the other indicators and which now face the prospect of losing ground in a bigger and more diverse survey.
Friday, March 26, 2010
There has been a lot of ranking-related activity over the last few days.
- Phil Baty of Times Higher Education and the QS team of John O'Leary, Martin Ince, Nunzio Quacquarelli and Ben Sowter have given presentations at the British Council Going Global 4 conference in London. Phil Baty was apologetic over the flaws of the old THE-QS rankings while the QS team saw no reason to change.
- The Economist has an article "Leagues Apart" that briefly reviews the development of international university rankings. Observations include the volatility of the rankings. Perhaps inevitably the example chosen is the fall of LSE after QS introduced standardised scores which helped universities that produced more citations.
- Phil Baty in THE comments on the problems of assessing the quality of teaching in universities.
Saturday, March 20, 2010
QS have answered some questions over at QS TOPUNIVERSITIES. Here are some of the questions and answers, or extracts from them, along with some comments.
"1) How do you plan to address the perceived bias towards English-speaking (and particularly UK) universities?
...The reality is, however, that in many areas of university competitiveness, operating in English is an advantage. English language journals are more widely read and cited, the top four destinations for international students (and I suspect also faculty) are the US, Canada, UK and Australia – all English speaking. Many universities in non-English speaking Asia, recognising this are operating more programs in English and all global rankings currently carry this bias, not just ours. Our objective is to minimise the bias, but it is far from clear whether eliminating it entirely would be appropriate."
Fair enough. But the bias within the English speaking world (the high scores for Oxbridge, the London schools and colleges, Edinburgh and Australian universities compared to the US and Canada) in the THE-QS rankings was probably more significant.
"3) Following the launch of the government-funded Assessment of Higher Education Learning Objectives (AHELO) pilot scheme, how do you respond to the suggestion that an insufficient emphasis is given to teaching standards and student skills within the more research-oriented established methodologies?
QS absolutely concurs that teaching and learning is inadequately embraced in any of the existing global rankings, including our own and is watching the AHELO exercise with great interest to see if lessons can be drawn and applied to the much broader geographical scope of our rankings. QS is also assessing whether student and alumni inputs can help draw a clearer picture of comparative performance in teaching and learning. On the student skills side of things, QS is currently the only global ranking taking this aspect seriously – via the Employer Review indicator."
Assessing the quality of teaching has so many pitfalls that it may never be possible to do it objectively on an international scale. A global version of RateMyProfessors might be feasible but there is obvious potential for rigging. It also has to be said that for more proficient students -- and that would include many or most of those in universities that will be in the top 200 or 300 in any sort of ranking -- teaching is largely irrelevant. I doubt if any high fliers from the Ivy League or the grandes ecoles were ever quizzed by interviewers about the staff-student ratio in their classes or whether their instructors explained desired learning outcomes or whether they felt safe in their lecture halls. If teaching is to be assessed, an opinion survey is probably no worse than anything else that might be proposed.
"4) Do you think that the low ranking of LSE in the 2009 rankings (67th) is reflective of an inherent bias toward scientific subjects within citations-based methodologies, and if so how do you plan to address this in 2010?
The QS World University Rankings™ are designed to assess the all-round quality of universities across all disciplines and levels, in teaching, research, employability and internationalisation. LSE is a fantastic institution, as is reflected by their persistent high position in the Social Sciences – the faculty area in which they are focused. In fact, it is so strong with its narrower focus that it manages to compete with world leading institutions with a much broader range. Even if we only take the proportion of world universities recognised by UNESCO a Top 100 placing represents the top 1% - a prolific achievement for an institution that focuses on only a small part of the academic spectrum. To put things in perspective, LSE fails to break the top 200 in the Shanghai Ranking."
It seems that the position of LSE in the forthcoming rankings will be closely watched. Yes, there has been a bias against institutions with strengths in the social sciences and this may be corrected in the THE rankings but anything that benefits LSE will also benefit general universities as much or more.
Thursday, March 18, 2010
QS are recruiting a Research Assistant for their London office, presumably to work on the data collection for the 2010 rankings.
"You will have analytical insight and familiarity with working with large data sets, as well as being an effective communicator both personally and in writing. You will be results-oriented and dedicated to contributing to the success and development of our business unit and its research outputs. Responsibilities include: Data collection - gathering correct information from universities directly via email, website, telephone or third party sources; Data entry - accurate data entry into existing online database; Correspondence - dealing with university representatives or third party clients, handling enquiries, promoting the products; Research - research the web or other applicable sources for useful information; Research Outputs - contributing to high quality and insightful research outputs. Gathering the correct information from universities can be a challenging task and often requires a surprising level of skill, tenacity and diplomacy as well as a healthy appetite for problem solving. Therefore, skills/attributes required: ability to stay focused and high attention to detail; tenacity, diplomacy and reliability; healthy appetite for problem solving; inquisitive mind and genuine interest; good communication; effective time management; commitment and enthusiasm; excellent knowledge and experience of office software applications; additional languages desirable."
It sounds like they are getting serious. The additional languages might be significant. But this bit at the end is surprising.
"This is a full time position requiring a minimum of 35 hours per week and a maximum of 40."
A maximum work week of 40 hours! I wonder if there are any universities anywhere in the world that have that. And I bet they don't in Shanghai.
Friday, March 12, 2010
There is a detailed and thorough review of recent developments by David Jobbins in University World News:
First shots fired in ranking war
There has been a lot of news from the rankosphere over the last week.
On the 8th of March QS World University Rankings announced that they were launching their 2010 Research.
"largest review of international universities ever conducted
* Over 2000 participating universities from more than 130 countries
* Over 200,000 university selections by academics, for excellence in research quality*
* 5000 participating employers"
If the response is similar to last year, when the average academic reviewer listed 12 universities, then 200,000 university selections would mean about 17,000 respondents, quite a big jump. Two thousand participating universities would mean more than doubling the number of universities assessed, a very good idea in principle, although there could be logistical problems and, of course, the chances of embarrassing errors will increase.
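For readers who want to check the arithmetic, here is the back-of-the-envelope calculation behind that 17,000 figure. Nothing in it goes beyond the numbers already quoted; the assumption, as stated above, is that the average respondent again lists about 12 universities.

```python
# Estimate the number of QS survey respondents implied by the
# announced figures, assuming last year's average list length.
selections = 200_000        # "over 200,000 university selections"
avg_per_respondent = 12     # average universities listed per academic in 2009

respondents = selections / avg_per_respondent
print(round(respondents))   # 16667, i.e. roughly 17,000
```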
Meanwhile, QS have started a new newsletter QS Rankings & Global Higher Education Trends and also started a question and answer page.
On the 11th of March, Times Higher Education announced:
"The biggest and most ambitious project to measure universities' academic reputation for the Times Higher Education World University Rankings was launched this week.
Thomson Reuters, the exclusive data supplier and analyst for the THE rankings in 2010 and beyond, unveiled its Academic Reputation Survey in Philadelphia on 11 March.
Over the coming weeks, thousands of academics around the world, who have been carefully selected as being statistically representative of the global academic workforce, will be asked to complete a short, invitation-only survey to state which in their opinion are the strongest universities in their fields of expertise.
In a major new development, the survey will gather opinions on the standards of both research and teaching, raising the prospect of the first worldwide reputation-based measure of teaching quality in higher education. "
It sounds like THE are going to draw much of their survey sample from the database of Thomson Reuters. In other words they will survey only or mainly published researchers, which is highly appropriate if research quality is the only thing that is being assessed. Now that THE are going to ask about teaching quality, it might be worth thinking about also surveying teaching-only university staff and undergraduate and postgraduate students.
So we are going to have the largest review ever conducted versus the biggest and most ambitious project. Whatever happened to that British gift for understatement?
Thursday, March 11, 2010
Over the last few years university rankings have acquired a large audience. Each year since 2003, when the first Shanghai index came out, the ups and downs of universities, especially in East and Southeast Asia, have commanded almost as much attention as the fortunes of national football teams.
This year it seems that competition between the rankers, Times Higher Education and their former partners, QS, will get as much attention as that between universities, and a lot of that attention will go to the merits or flaws of the surveys that are now under way.
Times Higher have just announced the launch of the new reputational survey while QS have started a sign-up facility. If THE start their survey now they could create a problem for QS: after one e-mail message plus a few follow-ups (I expect Ipsos MORI will tell them about this) and, for some people, a form from the EU rankings, severe ranking fatigue will set in and the later survey forms will go unanswered.
Here are some points of comparison of the two main surveys that will be filling academic e-mail boxes in the next few weeks or months.
Indicator Weighting
QS have stated that their survey will continue to have a weighting of 40 percent. Times Higher say that theirs will have a smaller weighting but have not said exactly how small. Probably the reduction will not be too great if the expense and effort of conducting a survey is to be justified.
Participants
The bulk of QS's survey respondents have come from the mailing lists of World Scientific, a Singapore-based publishing company that is linked with Imperial College London and has had a close relationship with Peking University. Others, mainly in the humanities and social sciences, have come from Mardev, a company that collects academic addresses. Some no doubt have been identified during QS's various seminars and tours. This year QS have added a sign-up facility that will screen those who wish to take part.
THE will get most of their respondents from the Thomson Reuters internal database by which they presumably mean authors of papers in ISI-indexed journals and conference proceedings, supplemented by so far unidentified third party sources.
The basic qualification for participating in the QS survey is therefore to subscribe to a newsletter from World Scientific. For the THE survey it will be to have published a paper in a reputable academic journal or conference proceedings. The THE respondents should then be better qualified to comment on research quality, although one might note that the assigning of the role of first or corresponding author is sometimes a political decision rather than a recognition of actual contributions to a research project.
Numbers
THE have said that they are aiming at a target of 25,000 participants. QS appear to be aiming at close to 17,000 this year.
Regional and Disciplinary Balance
QS have stated that they weight by discipline and subject when selecting potential respondents from the World Scientific and Mardev databases. After data collection they balance responses between three super-regions, the Americas, Asia and the Pacific, and Africa, Europe and the Middle East, but not apparently within those regions. THE have stated they will distribute the survey forms to reflect the world distribution of academic researchers geographically and in terms of discipline.
Questions
THE have stated that they will be asking questions about teaching and research and that the questions about research will be more focused than in the past. QS will continue to ask only about research, which is a little odd since their respondents probably include many who teach but do not do research.
Languages
Last year the THE-QS forms could be answered in English or Spanish. QS may be including other language options this year. So far, it looks as though the THE forms will be entirely in English.
General
It appears that THE may produce a valid survey of the opinion of recently published researchers that reflects the current global distribution of academic research activity. The main problem may well be that there will be a serious conflict between quantity and quality. Academic e-mail addresses are highly degradable and THE may find that many of their published researchers have retired, been downsized, moved, died, forgotten their password or just got fed up with filling out online survey forms. If, in pursuit of the targeted 25,000, they are forced to start trying to contact scientists who published an article (or just put their names on the work of graduate students) several years ago the validity of the survey may become questionable.
On the other hand, it would seem an error for QS to insist on continuing to ask only about research. The THE-QS survey was a dubious measure of research performance but it might have more credibility if it also measured teaching quality or social and economic contributions.
On balance, it would seem that THE, if it can get the number of respondents it needs, will produce a more accurate and credible survey of opinion about research, although QS might claim that by reaching out to university teachers and non-English speakers they are providing a platform for those whose views ought to be considered in any opinion survey.
Saturday, March 06, 2010
Times Higher has a leader by Ann Mroz on the rising tide of academic bureaucracy, to which, we might add, rankings, ratings and assessment have made no small contribution.
"But banal and mind-numbing though it is, bureaucracy isn't neutral. It is insidious, changing the nature of both teaching and research; it also, of course, has been used to push academics in uncomfortable directions. A scary new word to emerge in our cover story is "hyper-bureaucracy", which describes "an out-of-control system" that emerges in the search for optimum efficiency and takes no account of the costs in time, energy and money that are needed to achieve it. It is a bureaucratic nightmare in which there is no end to the extra information that can be acquired. The monitoring of contact hours and how academics spend their time are examples of the type of bureaucracy that "eats up people and resources", according to Andrew Oswald, professor of economics at the University of Warwick."
Thursday, March 04, 2010
Phil Baty in today's Times Higher Education explains why the rankings need an overhaul despite their growing influence.
"So if the rankings have become an accepted reference point, why are we making such dramatic changes, switching our data provider and revamping our methodology? We are doing so precisely because the rankings have become such a respected reference point. If they are starting to influence strategic thinking and even government policy, we have a responsibility to make them as rigorous as possible."
Wednesday, March 03, 2010
Saturday, February 27, 2010
Phil Baty in Times Higher Education writes about fluctuations in the old THE-QS rankings
"Magazines that compile league tables have an interest in instability - playing around with their methodologies to ensure rankings remain newsworthy.
This was the argument made by Alice Gast, president of Lehigh University, Pennsylvania, at the Lord Dearing memorial conference at the University of Nottingham this month.
She has a point. Dramatic movements in the league tables make the news and generate interest - helpful for the circulation figures.
But too much movement raises questions about credibility: everyone knows that it takes more than 12 months for an 800-year-old university to lose its status, or for a young pretender to ascend the heights. "
The THE-QS rankings were famous for their yearly fluctuations. This of course helped to make them much more popular than the reliable but boring Shanghai rankings (unless you were prepared to spend a few hours cutting and pasting the indicator scores of universities in the 300s and 400s into an Excel file and then they could be interesting). The rises and falls resulted from changes in methodology, errors, correction of errors and inconsistent application of guidelines.
Still, there are cases when universities undergo serious restructuring or pour massive funds into research or recruit administrators of the highest calibre, and these developments should be reflected in any valid index. Rankings that do not show some upward movement by, say, the Hong Kong University of Science and Technology or King Abdullah University of Science and Technology ought to be considered suspect.
Equally, it is striking that the major rankings contain elements -- the THE-QS academic opinion survey, the Nobel laureates in the Shanghai rankings, even the eleven-year publication and citation counts in the Taiwan rankings -- that disguise the steady relative decline of Oxford and Cambridge over the last two decades.
We shall have to wait until 2011 to see if the new THE ranking will avoid the suspicious fluctuations of the THE-QS rankings and also be sensitive to genuine changes in international higher education.
Wednesday, February 24, 2010
Richard Vedder of the Center for College Affordability and Productivity produces an interesting US ranking based on value for money for students. One key element is data provided from the famous or notorious site, RateMyProfessors. I used to think that it would be a step forward in international university rankings to try to do something like this on a global scale. Now I am not so sure.
I assume that everybody has heard of the tragic shooting at the University of Alabama at Huntsville. My first suspicion was that the alleged murderer, Amy Bishop, was a talented but socially awkward academic who had snapped after being denied tenure on flimsy grounds of collegiality or for being politically incorrect.
That does not look like being the case. A blog, Shepherds and Black Sheep, has analysed her research output and found that it seems inflated, with one article having her children as co-authors, another published in an "online vanity press" and several more co-authored with her husband.
Bishop's pages at RateMyProfessors are also interesting. At first sight they look quite impressive with a total score of 3.6 out of 5, fifth best in her faculty, and 3.4 for clarity and 3.7 for helpfulness. But there are some oddities.
The user comments start with three excellent reviews, one on 26 May, 2009 and two on 19 May. A little odd. Back in June 2004 there were also two rave reviews again posted on the same day.
It is also noticeable that the good reviews tend to cluster together, with three consecutive good reviews in January and February 2006 and another three, one after the other, in November 2004 and January 2005.
Another odd thing is that for the helpfulness and clarity indicators there is a very distinctive distribution curve. For helpfulness, Bishop had 6 ones (the worst), 2 twos, 5 threes, 3 fours, and then 17 fives (the best). For clarity, it was 7 ones, 5 twos, 4 threes, 3 fours and 15 fives. Note the dramatic jump from four to five in both categories.
Compare this with a low scoring teacher who gets 4 fours and 4 fives for helpfulness and 5 fours and 6 fives for clarity.
Compare also a high scoring faculty member with 13 fours and 25 fives for helpfulness and 15 fours and 28 fives for clarity. A big jump from four to five but proportionately much less than Bishop's.
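One quick way to put a number on "proportionately much less" is the ratio of fives to fours: a spiky distribution at the very top end shows up as an unusually high ratio. This little sketch uses only the figures quoted above.

```python
# Ratio of top ratings (fives) to fours, from the figures in the post.
# A very high ratio suggests an oddly spiky distribution at the top end.
def five_to_four_ratio(fours, fives):
    return fives / fours

# Bishop: 3 fours / 17 fives (helpfulness), 3 fours / 15 fives (clarity)
print(round(five_to_four_ratio(3, 17), 1))   # 5.7
print(round(five_to_four_ratio(3, 15), 1))   # 5.0
# High-scoring colleague: 13 fours / 25 fives, 15 fours / 28 fives
print(round(five_to_four_ratio(13, 25), 1))  # 1.9
print(round(five_to_four_ratio(15, 28), 1))  # 1.9
```

So Bishop's fives outnumber her fours roughly five to one, against less than two to one for the high-scoring colleague.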
Is it possible to rig RateMyProfessors?
According to Tenured Radical it is very easy.
"To test my theory that ratings could be posted by people who had never been my students, I went to the dreaded site, and registered myself, under my own name, as a Zenith student. Easy-peasy. The only false information I provided was a birth date that made me 19 years old (I wish!) and the box I checked that affirmed my status as a Zenith sophomore. I then successfully added a rating about myself. You can see it here: it's the anxious looking green emoticon that has the comment "interesting." I thought it only fair to add something right down the middle, neither good nor bad. Inflammatory perhaps, but arrogant never, that's my motto. "
So, I have a strong suspicion that someone had been going to RateMyProfessors and posting effusive comments about Bishop (and nasty ones about other faculty members?).
Perhaps RateMyProfessors is not such a good indicator after all.
Tuesday, February 23, 2010
A survey was conducted recently for Thomson Reuters to provide input for the forthcoming Times Higher Education World University Rankings. The results of the survey can be accessed at the Global Institutional Profiles Project set up by Thomson Reuters.
The results of the survey are important since they might provide a clue to what the new ranking will look like.
There were 350 respondents from the "global academic community". This is apparently more than the numbers that answered similar surveys by QS but it does not seem very large, especially when THE has raised justified concerns about the low and possibly unrepresentative numbers participating in the THE-QS survey of academic opinion.
Of those 350, 107 were from the UK (31%), 90 from the US, 30 from Australia, nine from Canada and seven from New Zealand. Thirty-three were from the rest of Europe, 32 from Asia and 42 from elsewhere, i.e. not from North America, Europe, Asia or Australasia. With nearly a third of the respondents coming from the UK and over two thirds from just five English-speaking countries, this is a distinctly Anglo-Saxon-centric affair.
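The "nearly a third" and "over two thirds" claims are easy to verify from the raw counts given above:

```python
# Check the national shares of the 350 survey respondents quoted above.
total = 350
uk, us, australia, canada, nz = 107, 90, 30, 9, 7

print(round(100 * uk / total))          # 31 per cent from the UK alone
anglophone = uk + us + australia + canada + nz
print(anglophone)                       # 243 respondents
print(round(100 * anglophone / total))  # 69 per cent -- "over two thirds"
```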
The first question was the level of familiarity with various rankings. The THE ranking was the one with which the largest number of respondents were familiar. This is a slightly odd result since from 2004 to 2009 the rankings were published under the name THES-QS (later THE-QS) World University Rankings. The Times Higher Education World University Rankings (minus QS) have yet to appear. No doubt, THE would claim that it was they who published the rankings and can retrospectively rename them if they wish.
Summarizing the responses to the survey, it seems that the respondents believe that university rankings
• are useful
• have methodological problems
• are biased
• encourage the manipulation of data
• encourage a focus on numerical comparisons
• use data that is not transparent or reproducible
• do not include appropriate metrics
• favour research institutions
Among the information that respondents need or would like to have are
• Publications and citations
• Research awards
• Patents
• Faculty student ratio
• Faculty activity ratios (teaching income/research grants/publications per staff)
• Number of faculty by gender, international, ethnicity or race
• Number of graduate programs and degrees
• Collaboration
• Community engagement
• Perceptions of researchers, employers, alumni and community
Some of this, community engagement for instance, is too vague to be useful. Other items contradict the stated objectives of the developing ranking system: including patents and research collaboration in a general ranking would add more bias in favour of the natural and applied sciences. Others betray the American or European concerns of the respondents: alumni have little significance outside the USA. It is also noticeable that nobody seems interested in student perceptions.
Friday, February 19, 2010
This, in an article by Phil Baty of THE in The Australian, sounds promising.
"So we've started again. For 2010, Thomson Reuters has hired pollsters Ipsos MORI to carry out the reputation survey, and it has committed to obtaining 25,000 responses, from a carefully targeted and properly sampled group that will represent the true demographics of global higher education.
This may mean a slip in the performance of British and Australian institutions but if that is the case then so be it. We are interested in getting closer to the truth. "
Click here for a survey by Nick Clark in the World Education News and Reviews
Click here for another presentation from the CHEA International Seminar by Angela Yong-chi Hou of the Higher Education Evaluation and Accreditation Council of Taiwan.
Click here for a comprehensive and informative presentation by Robert Morse of US News at the CHEA International Seminar in Washington DC on America's best Colleges Rankings: A Brief History.
Phil Baty argues in yesterday's Times Higher Education that, despite the "jiggery-pokery" employed by some universities to get a better position in university rankings, "there is no need to sacrifice mission to position"
He refers to several cases of university administrators manipulating data to rise in the rankings. One example is Albion College in the USA, which divided a small alumnus donation into smaller annual payments. Frankly, I wonder if this is worth getting worried about. Surely, a far greater scandal in American colleges is the admission, in order to please alumni and get money out of them, of large numbers of academically unqualified student athletes.
The article then discusses "the less dishonest but nevertheless deleterious effects of rankings, such as pressing staff to publish in English-language journals, which may lift an institution's profile but may not best serve its local community".
This is true, but it should be noted that THE has shifted from Scopus data to Thomson Reuters, whose database has been criticised for its overwhelmingly English-language content.
Baty is right on target when he comments on institutions importing large numbers of foreign students in order to boost their score on the internationalisation indicator in the THE-QS rankings. There are, though, other reasons, mainly financial, for doing this. In the UK and Australia it is likely that in many cases this has contributed to a decline in quality.
Counting international students is rather different from counting international faculty. In most cases, students pay, or someone pays for them, to travel abroad to go to university but universities pay international faculty to come to them.
It would be a good idea if THE dropped the international student indicator. If they are going to keep it, then one simple and helpful measure might be to include the showing of a passport in the definition of international, in other words to treat the European Union, or at least the Schengen Area, as a single country.
Thursday, February 18, 2010
The answer is very good but there seem to be a few US universities that are better.
Tilburg University has just produced a new ranking of Economics schools based on publications (ISI indexed journals) in journals in Economics, Econometrics and Finance. Harvard is first with a score of 551 followed by Chicago (385). LSE is eighth alongside Northwestern University with a score of 280. Oxford is 22nd and University College London 29th. Tilburg is 23rd.
If LSE can only get to eighth place in Economics then what can we expect from an objective ranking in the natural sciences and the arts and humanities?
Wednesday, February 17, 2010
The country share of visitors to this blog is as follows. Noticeably absent are China and Russia, unless they are in 'unknown' (12%).
United States 22%
United Kingdom 8%
Switzerland 7%
Canada 4%
Malaysia 4%
Singapore 4%
Germany 3%
Nigeria 3%
France 3%
Indonesia 2%
India 2%
Poland 2%
Spain 2%
Mexico 1%
Czech Republic 1%
Greece 1%
Brunei 1%
Belgium 1%
Australia 1%
Ireland 1%
Japan 1%
Saturday, February 13, 2010
I recently came across a site called StrategicFIRST that ranks websites according to traffic and indicates an estimated value for the site. I am not sure how reliable it is but here are the data for some sites associated with international university ranking.
Estimated Value
Webometrics (webometrics.info) $97,281
QS Quacquarelli Symonds (topuniversities.com) $89,122
Scimago (scimagojr.com) $86,528
Academic Ranking of World Universities (arwu.org) $79,545
Times Higher Education (timeshighereducation.co.uk) $79,132
HEEACT (heeact.edu.tw) $23,684
University Ranking Watch (rankingwatch.blogspot.com) $5,176
Global Universities Ranking [Russia] (globaluniversitiesranking.org) $3,941
Princeton Review (princetonreview.com/college-rankings.aspx) $3,802
There is a comment by Nunzio Quacquarelli on the QS topuniversities rankings blog.
Here is an extract:
"In October 2009, QS and THE ended their collaboration under which THE was licensed to publish the QS results known as “Times Higher Education (THE) – QS World University Rankings”. Since then, THE have announced they intend to produce their own rankings and have been systematically critical of QS’ methodology as part of their explanation for the split. This is surprising; THE consistently praised the QS methodology throughout the six-year publishing collaboration. Indeed, their former publishing director described it as one of the best partnerships in the history of THE.
Similarly, Ann Mroz, Editor of THE wrote in October 2008: "These rankings use an unprecedented amount of data to deliver the most accurate measure available of the world’s best universities, and of the strength of different nations’ university systems. They are important for governments wanting to gauge the progress of their education systems, and are used in planning by universities across the world."
Phil Baty, Associate Editor of THE wrote only on October 10 2009: “Congratulations on a highly successful campaign on the rankings again this year. The internet is buzzing.” Yet it seems our objectives and methodological principles have subsequently diverged. QS will continue to produce our rankings using citation data from the Scopus database of Elsevier. THE have decided to align themselves with Thomson Reuters’ academic citation database."
Thursday, February 11, 2010
The new Webometrics ranking is out.
Some interesting points:
The top 20 are all in the USA.
The best non-US university is Cambridge at 27.
British universities do not do very well. Oxford is at 37, University College London at 57 and Imperial at 157 while Webometrics joins the anti-LSE conspiracy by putting it at 234.
The top European universities seem to be in the North -- Edinburgh, Oslo, Helsinki. Something about the cold weather?
Regional Rankings
Best in Latin America: Sao Paulo
Best in Europe: Cambridge
Best in Central and Eastern Europe: Charles University
Best in Asia: Tokyo
Best in South East Asia: National University of Singapore
Best in South Asia: Indian Institute of Technology Bombay
Best in the Arab World: King Saud University
Best in Oceania: Australian National University
Best in Africa: Cape Town
Finally Israeli universities should get a special award for mobility. They manage to be in Asia and Europe at the same time.
Tuesday, February 09, 2010
The rise of China to scientific superpower status has been well documented. See here for a report by Jonathan Adams, Christopher King and Nan Ma.
This can be confirmed by a simple search of the Scopus database, which reveals 38,360 scientific publications from China in 1999 compared to 250,452 in 2009. For the United States the corresponding figures were 311,879 and 367,641.
The UK, France and Germany recorded modest increases over the decade while research output in Russia actually fell.
A certain amount of caution is in order. These figures refer to the quantity of research, not to its quality and China does have a large, although stable, population. Still, the West has cause to be concerned.
Some other countries have improved quite considerably over the decade. Korea, India, Australia and Hong Kong have doubled or nearly doubled and Thailand has more than tripled its research output.
It is especially noticeable that Malaysia is catching up with Singapore. The former had 1,235 publications in 1999 and the latter 4,538. In 2009 the figures were 7,834 and 10,993.
However, the prize for rapid growth goes to Iran, which had 1,351 publications in 1999 and 19,088 in 2009. Compare this with Israel: 11,918 and 16,335.
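As a quick check, the growth multiples implied by the figures quoted above can be worked out in a few lines of Python. This is just arithmetic on the numbers already given in this post; it assumes no access to Scopus itself.

```python
# 1999 and 2009 Scopus publication counts as quoted above.
counts = {
    "China":     (38360, 250452),
    "USA":       (311879, 367641),
    "Malaysia":  (1235, 7834),
    "Singapore": (4538, 10993),
    "Iran":      (1351, 19088),
    "Israel":    (11918, 16335),
}

# Growth multiple over the decade: 2009 output divided by 1999 output.
growth = {country: round(y2009 / y1999, 1)
          for country, (y1999, y2009) in counts.items()}

for country, factor in sorted(growth.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {factor}x")
```

The ratios bear out the argument: Iran's output grew roughly fourteenfold, China's about six and a half times, while the United States and Israel grew by well under half.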
If research in Iran goes on advancing at this rate, and if other countries in the region also develop their scientific capabilities, and if the ultra-orthodox extend their assault on reason and science into Israeli schools and universities, it looks as though Hamas and Hezbollah are going to be the least of Israel's problems.
Friday, February 05, 2010
There has been a lot of discussion about university rankings recently. In Times Higher Education, Phil Baty refers to a comment in the satirical magazine Private Eye about the forthcoming European Union rankings. Why spend public money on the ranking of universities when there are already two recognised rankings? Perhaps it has something to do with the striking absence of continental European universities from the upper reaches of the THE-QS and Shanghai rankings.
Baty claims to be less cynical than Private Eye. He says that:
"While I am sure CHERPA will strive to be fully independent, it is a group made up exclusively of European universities, and was set up in direct response to Europe's poor showing in the current rankings, so some suspicion is inevitable.
More serious, and entertaining, questions have been asked over other rankings. Russia's RatER raised eyebrows for putting Moscow State University in fifth place, ahead of Harvard and Cambridge, and a ranking from France's Mines ParisTech has been ridiculed for putting five French universities into the top 20."
However, one should not assume that the forthcoming THE rankings will be biased because
"these concerns give THE great confidence - as an independent magazine we are free from the influence of any institution or authority.
We are accountable only to our readers - an increasingly international community of thousands of academics and university administrators. "
But this raises certain questions. Is THE not accountable to the company that owns it? Another question concerns that "increasingly international" community. "Increasingly" from what to what? And to whom are those administrators responsible?
The national bias of the Paris Mines ranking is indisputable. There the top French institution is in sixth place. In the most recent THE-QS rankings the top French institution was 38th, in the Russian RatER rankings 36th, in the Shanghai Academic Ranking of World Universities 40th, in the Taiwan rankings 88th and in Webometrics 129th.
The bias of the Russian rankings is even more glaring. They put Moscow State University in fifth place. In no other ranking did it even get into the top fifty.
I am not suggesting that there is anything dishonest about the Paris and Russian rankings. The Paris ranking is as transparent as it is possible to be: it simply counts the number of CEOs of the top 500 companies who attended particular schools, and everything is in the public record. The Russian rankings are not so transparent. The problem here is that their questionnaire contains many references to indicators specific to Russia and the CIS. It is also written in a style that many people would find close to incomprehensible.
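The Paris Mines method really is simple enough to sketch in a few lines. The following is a hypothetical illustration only, not the actual ranking script; the school names and CEO data are invented for the purpose of the example.

```python
from collections import Counter

# Hypothetical sample: alma maters of CEOs of "top 500" companies.
# These names and counts are invented for illustration only.
ceo_alma_maters = [
    "Harvard", "Ecole Polytechnique", "Harvard", "Tokyo",
    "HEC Paris", "Ecole Polytechnique", "Oxford",
]

# The Mines ParisTech approach, as described above: one point per CEO,
# then rank schools by their total.
scores = Counter(ceo_alma_maters)
for rank, (school, n_ceos) in enumerate(scores.most_common(), start=1):
    print(f"{rank}. {school}: {n_ceos} CEO(s)")
```

The point of the sketch is that the single criterion, CEO production, is entirely transparent and entirely checkable, which is exactly why its national bias is so easy to see: it rewards the grandes écoles' traditional pipeline into French boardrooms.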
The bias in the Paris and Russian rankings stems not from dishonesty but from the choice of criteria that are likely to give an advantage to universities in their countries while downplaying or ignoring those in which their countries are not so strong.
In contrast, the Shanghai, Taiwan, Webometrics, and Scimago rankings appear to have no home country bias at all.
What about THE? The old THE-QS rankings were pretty obviously biased in favour of British universities. Last year they had Cambridge in second place. The Shanghai rankings put it in 4th place, although that will not be sustained as the impact of old Nobel winners fades. In the Paris Mines ranking it was 7th, in the Russian rankings 8th, in the Taiwan rankings 15th, in Webometrics 22nd, in Scimago 34th and in the Leiden green index (the size-independent, field-normalized average impact) 37th.
We will see if Cambridge and Imperial College maintain their suspiciously high places in the new THE rankings. If they start slipping a little I will be inclined to agree that THE has in fact overcome its anglocentric bias.