Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Friday, June 17, 2016
Dumbing Down at Oxbridge
The relentless levelling of British universities continues. The latest sign is a report from Oxford, where the university is getting ready to crack down on colleges that make their students work too hard. Some of them apparently have to write as many as three essays a week, and most work at least 40 hours a week, some longer, reportedly twice as much as students at places like Northumbria University.
Many commentators have mocked the poor fragile students who cannot cope with a fifty-hour week. After all, that is nothing compared to what they can expect if they start legal, medical or research careers.
Something else that is a little disturbing is that Oxford students apparently need so much time to do that amount of work. One would expect the admissions system at Oxford to select academically capable students who could do as little work as those at Northumbria and still perform much better. If Oxford students can only stay ahead by working so hard, doesn't this mean that Oxford is failing to find the most intelligent students and has to make do with diligent mediocrities instead?
The villain of the piece is probably the abolition of the essay-based Oxford entrance exam in 1995 (Cambridge abolished theirs in 1986), which threw the burden of selection onto A level grades and interviews. The subsequent wholesale inflation of A level grades has meant that an undue importance is now given to interviews, which have been shown repeatedly to be of limited value as a selection tool, particularly at places like Oxbridge where the interviewers have sometimes been biased and eccentric.
So Oxford and Cambridge are now planning to reintroduce written admission tests. They had better do it quickly if they want their graduates to compete with the Gaokao-hardened students from the East.
Thursday, June 09, 2016
THE is coming to America
Times Higher Education (THE) has just announced that American university rankings are not fit for purpose.
We have heard that before. In 2009 THE said the same thing about the world rankings that they had published in partnership with the consulting firm Quacquarelli Symonds (QS) since 2004.
The subsequent history of THE's international rankings provides little evidence that the magazine is qualified to make such a claim.
The announcement of 2009 was followed by months of consultation with all sorts of experts and organisations. In the end the world rankings of 2010, powered by data from Thomson Reuters (TR), were not quite what anyone had expected. There was an increased dependence on self-submitted data, a reduced but still large emphasis on subjective surveys, and four different measures of income, reduced to three in 2011. Altogether there were 14 indicators, reduced to 13 in 2011, all but two of which were bundled into three super-indicators, making it difficult for anyone to figure out exactly why any institution was falling or rising.
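To see why this bundling makes diagnosis so hard, consider a rough sketch in Python. The indicator names, weights and scores below are invented for illustration; they are not THE's actual 2010 scheme.

```python
# Rough sketch of how bundling indicators into weighted composites
# hides the cause of a score change. The indicator names, weights and
# scores are invented for illustration, not THE's actual 2010 scheme.

def composite(scores, weights):
    """Weighted sum of normalised indicator scores (0-100 scale)."""
    return sum(scores[k] * w for k, w in weights.items())

# A hypothetical "teaching" super-indicator bundling three metrics.
weights = {"reputation_survey": 0.5, "staff_student_ratio": 0.3, "phd_awards": 0.2}

year1 = {"reputation_survey": 60.0, "staff_student_ratio": 70.0, "phd_awards": 50.0}
year2 = {"reputation_survey": 54.0, "staff_student_ratio": 80.0, "phd_awards": 50.0}

# Both years publish an identical composite of 61.0, so an outside
# observer cannot tell that reputation fell while staffing improved.
print(round(composite(year1, weights), 2))  # 61.0
print(round(composite(year2, weights), 2))  # 61.0
```

When only the composite is published, a fall of ten places could come from any of the bundled metrics, or from a methodological tweak to the weights themselves.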
There were also some extraordinary elements in the 2010 rankings, the most obvious of which was placing Alexandria University in 4th place in the world for research impact.
The rankings received a chorus of criticism mixed with some faint praise for trying hard. Philip Altbach of Boston College summed up the whole affair pretty well:
“Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the smell test.”
THE and TR returned to the drawing board. They did some tweaking here and there and in 2011 got Alexandria University out of the top 200, although more oddities would follow over the next few years, usually associated with the citations indicator. Tokyo Metropolitan University, Cadi Ayyad University of Marrakech, Federico Santa Maria Technical University, Middle East Technical University and the University of the Andes were at one point or another declared world class for research impact across the full range of disciplines.
Eventually the anomalies became too much, and after breaking with TR in 2015 THE decided to have a spring cleaning and tidy things up a bit.
For many universities and countries the results of the 2015 methodological changes were catastrophic. There was a massive churning, with universities going up and down the tables. Universite Paris-Sud, the Korea Advanced Institute of Science and Technology, Bogazici University and Middle East Technical University fell scores of places.
THE claimed that this was an improvement. If it was, then the previous editions must have been hopelessly inadequate. But if the previous rankings were the gold standard, then those methodological changes were surely nothing but gratuitous vandalism.
THE has also ventured into faraway regions with snapshot or pilot rankings. The Middle East was treated to a ranking with a single indicator that put Texas A&M University at Qatar, a branch campus housing a single faculty, in first place. For Africa there was a ranking consisting of data extracted from the world rankings without any modification of the indicators, which did not seem to impress anyone.
So one wonders where THE got the chutzpah to tell the Americans that their rankings are not fit for purpose. After all, US News had been doing rankings for two decades before THE, and its America's Best Colleges rankings include metrics on retention and reputation as well as resources and selectivity. Also, there are now several rankings that already deal directly with the concerns raised by THE.
The Forbes/CCAP rankings include measures of student satisfaction, student indebtedness, on-time graduation, and career success.
The Brookings Institution has a value-added ranking that includes data from the College Scorecard.
The Economist has produced a very interesting ranking that compares expected and actual value added.
So exactly what is THE proposing to do?
It seems that there will be a student engagement survey, which apparently will be launched this week and will cover 1,000 institutions. They will also use data on cost, graduation rates and salaries from the Integrated Postsecondary Education Data System (IPEDS) and the College Scorecard. Presumably they are looking for some way of monetising all of this, so probably large chunks of the data will only be revealed as part of benchmarking or consultancy packages.
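For anyone who wants to poke at the same sources, the College Scorecard has a public API. Here is a minimal sketch; the field names are my assumptions based on the Scorecard's naming scheme, so check the official data dictionary before relying on them.

```python
# Rough sketch of pulling cost and earnings figures from the College
# Scorecard's public API, one of the sources THE says it will use.
# The field names are assumptions; verify against the data dictionary.
import requests

BASE = "https://api.data.gov/ed/collegescorecard/v1/schools"
params = {
    "api_key": "YOUR_API_KEY",  # free key from api.data.gov
    "school.name": "Michigan State University",
    "fields": ",".join([
        "school.name",
        "latest.cost.tuition.in_state",               # assumed field name
        "latest.earnings.10_yrs_after_entry.median",  # assumed field name
    ]),
}

response = requests.get(BASE, params=params, timeout=30)
for school in response.json().get("results", []):
    print(school)
```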
I suspect that the new rankings will look something like the Guardian university league tables just published in the UK, but much bigger.
The Guardian rankings include measures of student satisfaction, selectivity, spending, staff-student ratio and value added. The latter compares entry qualifications with the number of students getting good degrees (a first or upper second).
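As a rough illustration of how such a value-added measure can work, here is a minimal sketch. The tariff bands and expected rates are invented assumptions, not the Guardian's actual model.

```python
# Minimal sketch of a value-added calculation of the Guardian's kind:
# compare the good degrees (first or upper second) actually awarded
# with the number expected from entry qualifications. The tariff bands
# and expected rates are invented assumptions, not the Guardian's model.

EXPECTED_GOOD_DEGREE_RATE = {
    "low_tariff": 0.55,   # weaker entry qualifications
    "mid_tariff": 0.70,
    "high_tariff": 0.85,  # stronger entry qualifications
}

def value_added(students):
    """students: list of (tariff_band, got_good_degree) tuples."""
    expected = sum(EXPECTED_GOOD_DEGREE_RATE[band] for band, _ in students)
    actual = sum(1 for _, good in students if good)
    # Positive means more good degrees than entry grades predicted.
    return actual - expected

cohort = [("low_tariff", True), ("low_tariff", True),
          ("mid_tariff", True), ("mid_tariff", False),
          ("high_tariff", True)]
print(round(value_added(cohort), 2))  # 4 actual vs 3.35 expected -> 0.65
```

The attraction of such a measure is that it rewards what a university does with its intake rather than the intake itself, which is exactly where institutions below the elite can shine.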
It seems that THE are planning something different from the research-centred, industry-orientated university rankings that they have been doing so far and are venturing into new territory: institutions two or three tiers below the elite that do little or no research.
There could be a market for this kind of ranking, but it is very far from certain that THE are capable of doing it or that it would be financially feasible.
Tuesday, May 31, 2016
UK rises in the U21 system rankings
The comparison of national higher education systems by Universitas 21 shows that the UK has risen from 10th place to 4th since 2012.
These rankings consist of four groups of indicators: resources, connectivity, environment and output. Since 2012 British higher education has risen from 27th to 12th for resources, 13th to 10th for environment, and 6th to 4th for connectivity. It was in second place for output in both 2012 and 2016, but its score rose from 62.2 to 69.9 over the four years.
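For concreteness, here is a minimal sketch of how four group scores might be combined into an overall score. The weights, and every score except the 69.9 output figure quoted above, are assumptions for illustration, not U21's published numbers.

```python
# Rough sketch of combining U21's four indicator groups into an overall
# score. Weights and all scores except the output figure (69.9, from
# the post above) are illustrative assumptions, not U21's actual data.

GROUP_WEIGHTS = {"resources": 0.2, "environment": 0.2,
                 "connectivity": 0.2, "output": 0.4}

def overall_score(group_scores):
    """Weighted sum of group scores, each on a 0-100 scale."""
    return sum(group_scores[g] * w for g, w in GROUP_WEIGHTS.items())

uk_2016 = {"resources": 70.0, "environment": 85.0,
           "connectivity": 80.0, "output": 69.9}
print(round(overall_score(uk_2016), 2))  # 74.96
```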
Every few months, whenever any sort of ranking is published, there is an outcry from British universities that austerity, government demands and interference, and immigration controls are ruining higher education.
If the U21 rankings have any validity then it would seem that British universities have been very generously funded in comparison to other countries.
Perhaps they could return some of the money or at least say thank you to the state that has been so kind to them.