Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Sunday, June 19, 2016
Worth reading 6: The Berlin principles
Just heard about this from Gary Barron.
Barron, Gary R.S. 2016. "The Berlin Principles on Ranking Higher Education Institutions: limitations, legitimacy, and value conflict." Higher Education, Online First, pp.1-17.
Abstract
University rankings have been widely criticized and examined in terms of the environment they create for universities. In this paper I reverse the question by examining how ranking organizations have responded to criticisms. I contrast ranking values and evaluation with those practiced by academic communities. I argue that the business of ranking higher education institutions is not one that lends itself to isomorphism with scholarly values and evaluation and that this dissonance creates reputational risk for ranking organizations. I argue that such risk caused global ranking organizations to create the Berlin Principles on Ranking Higher Education Institutions, which I also demonstrate are decoupled from actual ranking practices. I argue that the Berlin Principles can be best regarded as a legitimizing practice to institutionalize rankings and symbolically align them with academic values and systems of evaluation in the face of criticism. Finally, I argue that despite dissonance between ranking and academic evaluation there is still enough similarity that choosing to adopt rankings as a strategy to distinguish one's institution can be regarded as a legitimate option for universities.
Dot Connection Time
Singapore-based World Scientific Publishing, whose subscription lists were used to collect names for the QS academic opinion survey, are advertising a new book, Top the IELTS: Opening the Gates to Top QS-Ranked Universities, by Kaiwen Leong of Nanyang Technological University and Elaine Leong.
Nanyang Technological University is ranked 13th in the QS world rankings, ahead of Yale, Johns Hopkins and King's College London, and third in the Asian rankings.
World Scientific owns Imperial College Press.
Imperial College is eighth in the QS world rankings, ahead of Chicago and Princeton.
Friday, June 17, 2016
Dumbing Down at Oxbridge
The relentless levelling of British universities continues. The latest sign is a report from Oxford, where the university is getting ready to crack down on colleges that make their students work too hard. Some students apparently have to write as many as three essays a week, and most work at least 40 hours a week, some longer, which is roughly twice as much as at places like Northumbria University.
Many commentators have mocked the poor fragile students who cannot cope with a fifty-hour week. After all, that is nothing compared to what they can expect if they start legal, medical or research careers.
Something else that is a bit disturbing is that Oxford students apparently need so much time to do that amount of work. One would expect the admissions system at Oxford to select academically capable students who can do as little work as those at Northumbria and still perform much better. If Oxford students can only stay ahead by working so hard, doesn't this mean that Oxford is failing to find the most intelligent students and has to make do with diligent mediocrities instead?
The villain of the piece is probably the abolition of the essay-based Oxford entrance exam in 1995 (Cambridge abolished theirs in 1986), which threw the burden of selection onto A level grades and interviews. The subsequent wholesale inflation of A level grades means that undue importance is now given to interviews, which have repeatedly been shown to be of limited value as a selection tool, particularly at places like Oxbridge, where the interviewers have sometimes been biased and eccentric.
So Oxford and Cambridge are now planning to reintroduce written admission tests. They had better do it quickly if they want their graduates to compete with the Gaokao-hardened students from the East.
Thursday, June 09, 2016
THE is coming to America
Times Higher Education (THE) has just announced that American university rankings are not fit for purpose.
We have heard that before. In 2009 THE said the same thing about the world rankings that they had published in partnership with the consulting firm Quacquarelli Symonds (QS) since 2004.
The subsequent history of THE's international rankings provides little evidence that the magazine is qualified to make such a claim.
The announcement of 2009 was followed by months of consultation with all sorts of experts and organisations. In the end the world rankings of 2010, powered by data from Thomson Reuters (TR), were not quite what anyone had expected. There was an increased dependence on self-submitted data, a reduced but still large emphasis on subjective surveys, and four different measures of income, reduced to three in 2011. Altogether there were 14 indicators, reduced to 13 in 2011, all but two of which were bundled into three super-indicators, making it difficult for anyone to figure out exactly why any institution was falling or rising.
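To see why the bundling matters, consider a toy example (with made-up weights and scores, not THE's actual ones): sub-indicators can move sharply in opposite directions while the published super-indicator barely budges, so an outside observer cannot tell which metric drove a rise or fall.

```python
# Toy illustration (invented weights and data, not THE's methodology):
# bundling sub-indicators into a weighted composite hides their movement.

# Hypothetical weights within a single "super-indicator".
weights = {"phd_awards": 0.4, "income": 0.3, "reputation": 0.3}

# One university's hypothetical sub-indicator scores in two years.
year_2010 = {"phd_awards": 70.0, "income": 40.0, "reputation": 55.0}
year_2011 = {"phd_awards": 55.0, "income": 60.0, "reputation": 55.0}

def composite(scores, weights):
    """Weighted sum of sub-indicator scores, as published in a composite."""
    return sum(scores[k] * weights[k] for k in weights)

print(composite(year_2010, weights))  # 56.5
print(composite(year_2011, weights))  # 56.5 -- identical, despite big internal shifts
```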
There were also some extraordinary elements in the 2010 rankings, the most obvious of which was placing Alexandria University in 4th place in the world for research impact.
The rankings received a chorus of criticism mixed with some faint praise for trying hard. Philip Altbach of Boston College summed up the whole affair pretty well:
“Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the smell test.”
THE and TR returned to the drawing board. They did some tweaking here and there and in 2011 got Alexandria University out of the top 200, although more oddities would follow over the next few years, usually associated with the citations indicator. Tokyo Metropolitan University, Cadi Ayyad University of Marrakech, Federico Santa María Technical University, Middle East Technical University and the University of the Andes were at one point or another declared world class for research impact across the full range of disciplines.
Eventually the anomalies became too much, and after breaking with TR in 2015 THE decided on a spring cleaning and tidied things up.
For many universities and countries the results of the 2015 methodological changes were catastrophic. There was a massive churning, with universities going up and down the tables. Université Paris-Sud, the Korea Advanced Institute of Science and Technology, Boğaziçi University and Middle East Technical University fell scores of places.
THE claimed that this was an improvement. If it was, then the previous editions must have been hopelessly inadequate. But if the previous rankings were the gold standard of rankings, then those methodological changes were surely nothing but gratuitous vandalism.
THE has also ventured into faraway regions with snapshot or pilot rankings. The Middle East was treated to a ranking with a single indicator that put Texas A&M University at Qatar, a branch campus housing a single faculty, in first place. For Africa there was a ranking consisting of data extracted from the world rankings without any modification of the indicators, which did not seem to impress anyone.
So one wonders where THE got the chutzpah to tell the Americans that their rankings are not fit for purpose. After all, US News had been doing rankings for two decades before THE, and its America's Best Colleges rankings include metrics on retention and reputation as well as resources and selectivity. Also, there are now several rankings that already deal directly with the concerns raised by THE.
The Forbes/CCAP rankings include measures of student satisfaction, degree of student indebtedness, on-time graduation, and career success.
The Brookings Institution has a value-added ranking that includes data from the College Scorecard.
The Economist has produced a very interesting ranking that compares expected with actual graduate earnings to measure value added.
So exactly what is THE proposing to do?
It seems that there will be a student engagement survey, which apparently will be launched this week and will cover 1,000 institutions. THE will also use data on cost, graduation rates and salaries from the Integrated Postsecondary Education Data System (IPEDS) and the College Scorecard. Presumably they are looking for some way of monetising all of this, so probably large chunks of the data will only be revealed as part of benchmarking or consultancy packages.
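The College Scorecard data THE would be drawing on is publicly available through a free API, so anyone can inspect the raw material. Here is a minimal sketch of pulling a few figures; the endpoint is the real College Scorecard API, but the field names below are assumptions and should be checked against the official data dictionary.

```python
# Sketch: querying the public College Scorecard API for cost, completion
# and earnings data. Field names are assumptions -- verify against the
# College Scorecard data dictionary before relying on them.
import requests

API_KEY = "YOUR_API_KEY"  # free key from api.data.gov
URL = "https://api.data.gov/ed/collegescorecard/v1/schools"

params = {
    "api_key": API_KEY,
    "school.name": "Michigan State University",
    # Assumed field paths:
    "fields": "school.name,"
              "latest.cost.tuition.in_state,"
              "latest.earnings.10_yrs_after_entry.median",
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for school in resp.json().get("results", []):
    print(school)
```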
I suspect that the new rankings will look something like the Guardian university league tables just published in the UK, but much bigger.
The Guardian rankings include measures of student satisfaction, selectivity, spending, staff-student ratio and value added. The latter compares entry qualifications with the proportion of students getting good degrees (a first or upper second).
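As a rough illustration of how a value-added measure of this kind might work (a sketch with invented numbers, not the Guardian's actual model): fit the relationship between entry qualifications and good-degree rates across institutions, then score each institution by how far its actual rate beats the prediction.

```python
# Sketch of a value-added calculation (invented data, not the Guardian's model):
# predict each university's good-degree rate from its average entry tariff,
# then take the residual (actual minus predicted) as "value added".

# (entry tariff points, % of students getting a first or upper second)
data = {
    "Uni A": (180, 88.0),
    "Uni B": (150, 80.0),
    "Uni C": (120, 74.0),
    "Uni D": (100, 58.0),
}

# Simple least-squares line: good_degree_rate = a + b * tariff
n = len(data)
xs = [t for t, _ in data.values()]
ys = [g for _, g in data.values()]
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

for name, (tariff, actual) in data.items():
    predicted = a + b * tariff
    print(f"{name}: value added = {actual - predicted:+.1f} points")
```

On this toy data a university with weak entry grades but a decent good-degree rate (Uni C) comes out ahead of a more selective one that merely meets expectations, which is the point of a value-added metric.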
It seems that THE are planning something different from the research-centred, industry-orientated university rankings that they have been doing so far and are venturing out into new territory: institutions that are two or three tiers below the elite and do little or no research.
There could be a market for this kind of ranking, but it is very far from certain that THE are capable of doing it or that it would be financially feasible.