Friday, October 05, 2012
Observation on the THE Ranking
There will be some comments on the latest Times Higher Education World University Rankings over the next few days.
For the moment, I would just like to point to one noticeable feature of the rankings. The scores have improved across the board.
In 2011 the top university (Caltech) had an overall score of 94.8. This year it was 95.5.
In 2011 the 50th ranked university had a score of 64.9. This year it was 69.4.
In 2011 the 100th ranked university had a score of 53.7. This year it was 57.5 for the universities jointly ranked 99th.
In 2011 the 150th ranked university had a score of 46.7. This year it was 51.6.
In 2011 the 200th ranked university had a score of 41.4. This year it was 46.2.
The overall score is a combination of 13 different indicators, all of which are benchmarked against the highest scorer in each category, which receives a score of 100. Even if universities throughout the world were spending more money, improving staff-student ratios, producing more articles, generating more citations and so on, this would not in itself raise everybody's, or nearly everybody's, score.
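To make the point concrete, here is a minimal sketch of benchmark scaling with made-up numbers (THE's public description says only that the top scorer in each category receives 100): if every university's raw value grows by the same factor, scores scaled against the front-runner do not move at all.

```python
# Benchmark scaling as described above: the top scorer in a category gets 100
# and everyone else is scaled against it. Raw values are hypothetical.

def benchmark(raw):
    top = max(raw)
    return [round(100 * v / top, 1) for v in raw]

citations_2011 = [50000, 30000, 15000]               # made-up raw counts
citations_2012 = [v * 1.2 for v in citations_2011]   # everyone improves by 20%

print(benchmark(citations_2011))  # [100.0, 60.0, 30.0]
print(benchmark(citations_2012))  # [100.0, 60.0, 30.0] -- scores unchanged
```

On this arithmetic, a near-universal rise in scores requires the rest of the pack to close the gap on the front-runners across most indicators at once.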
There are no methodological changes this year that might explain what happened.
Wednesday, October 03, 2012
Dumbing Down Watch
Is any comment needed?
Students starting university for the first time this autumn will be given a detailed breakdown of their academic achievements, exam results, extra-curricular activities and work placements, it was revealed.
More than half of universities in Britain will issue the new "Higher Education Achievement Report", with plans for others to adopt it in the future.
University leaders said the document would initially list students’ overarching degree classification.
But Prof Sir Robert Burgess, vice-chancellor of Leicester University and chairman of a working group set up to drive the reforms, said it was hoped that first, second and third-class degrees would eventually be phased out altogether.
University Migration Alert
At the time of posting, the new THE rankings listed Stellenbosch University in South Africa as one of the top European universities.
Remind me to apply for a job with Thomson Reuters sometime.
THE Rankings Out
The Times Higher Education World University Rankings 2012 are out. The top ten are:
1. Caltech (same as last year)
2. Oxford (up 2 places)
3. Stanford (down 1)
4. Harvard (down 2)
5. MIT (up 2)
6. Princeton (down 1)
7. Cambridge (down 1)
8. Imperial College London (same)
9. Berkeley (up 1)
At the top, the most important change is that Oxford has moved up two places to replace Harvard in the number two spot.
Monday, October 01, 2012
Well, they would, wouldn't they?
Times Higher Education has published the results of a survey by IDP, a student recruitment agency:
The international student recruitment agency IDP asked globally mobile students which of the university ranking systems they were aware of. The Times Higher Education World University Rankings attracted more responses than any other ranking - some 67 per cent. This was some way ahead of any others. Rankings produced by the careers information company Quacquarelli Symonds (QS) garnered 50 per cent of responses, and the Shanghai Academic Rankings of World Universities (ARWU) received 15.8 per cent. Asked which of the global rankings they had used when choosing which institution to study at, 49 per cent of students named the THE World University Rankings, compared to 37 per cent who named QS and 6.7 per cent who named the ARWU and the Webometrics ranking published by the Spanish Cybermetrics Lab, a research group of the Consejo Superior de Investigaciones Científicas (CSIC).
At the top of the page is a banner about IDP being proudly associated with the THE rankings. Also, IDP, which is in the student recruitment trade, is a direct competitor of QS.
The data could be interpreted differently. More respondents were aware of the THE rankings and had not used them than knew of the QS rankings and had not used them.
Saturday, September 29, 2012
Forgive me for being pedantic...
My respect for American conservatism took a deep plunge when I read this in an otherwise enjoyable review by Matthew Walther of Kingsley Amis's Lucky Jim:
Its eponymous hero, Jim Dixon, is a junior lecturer in history at an undistinguished Welsh college. Dixon’s pleasures are simple: he smokes a carefully allotted number of cigarettes each day and drinks a rather less measured amount of beer most nights at pubs. His single goal is to coast successfully through his two-year probation period and become a permanent faculty member in the history department.
It is well known, or ought to be, that the institution in the novel was based on University College, Leicester, which is a long way from Wales. The bit about the "Honours class over the road", a reference to the Welford Road municipal cemetery, is a dead giveaway.
Walther can be forgiven though since he reminded me of this description of Lucky Jim's history article:
“It was a perfect title, in that it crystallized the article’s niggling mindlessness, its funereal parade of yawn-enforcing facts, the pseudo-light it threw upon non-problems. Dixon had read, or begun to read, dozens like it, but his own seemed worse than most in its air of being convinced of its own usefulness and significance.”
Dumbing Down Watch
The New York Fire Department has announced the results of a new entrance exam. The pass mark of 70 was reached by 95.72% of applicants. Previous tests had been thrown out because insufficient numbers of African-Americans and Hispanics were able to pass.
The new exam appears to be extremely easy and seems to assume that firefighting is a job that requires minimal intelligence. Effectively, the new policy for the New York Fire Department is to select at random from those able to get themselves to a testing center and answer questions that should pose no challenge to the average junior high school student.
The change was in response to the directives of Judge Nicholas Garaufis, a graduate of Columbia Law School, which would seem to be as lacking in diversity as the FDNY. One suspects that the judge's disdain for the skills and knowledge, not to mention physical courage, of the firefighters is rooted in blatant class prejudice.
When is someone going to file a disparate impact suit against the top law schools?
Dumbing Down Watch
David Cameron, Old Etonian and Oxford graduate, apparently does not know the meaning of Magna Carta.
But, looking at the reports of the Letterman interview, he did not actually say he didn't know.
So, perhaps he was just pretending to be dumb.
Tuesday, September 25, 2012
Are Indian Universities Really Good at English?
Indian universities have never done well in any of the international university rankings. The problem is that although there are many talented Indian students and researchers they seem, like Indian entrepreneurs, to leave India as soon as they can.
But some observers have been taking comfort from a good showing in QS's subject rankings for English Language and Literature, which actually consists largely of the 2011 academic survey. One observer notes:
But DU's English department has made the day special as it figured in the top 100 list. Consequently, the celebratory mood was not restricted to North Campus alone but spilled over to South and off-campus colleges. Smiles could be observed on the faces of teachers as well as their pupils as their hard work paid off. "It is always pleasing to know you have done well at international level," said Ratnakar Kumar from Khalsa College. Another student from Hansraj added jokingly, "So we haven't been reading just stories because they don't fetch you such accolades." Verily, students were high on emotions.
A number of reasons can be attributed for the success that Department of English at Delhi University has witnessed. The significant factor is that students are encouraged and pushed to think outside the box, making bizarre to impressive attempts in their process of learning. So in a country in which rote-learning is the norm, DUDE has adopted a strategy to keep such evil at bay. Beside, the world-class faculty has also contributed to the improving standards of English Department.
The Times of India comments:
Teachers cite a number of reasons for the success of DU's English department. "First, it's the profile of the department in terms of research and publications. We are on top in subaltern studies, in post-colonial studies. Then, the numbers — we have 600 MA students, of whom 10-15 are as good as anybody," says a professor. He adds that India is not considered modern for technology but for the ideas of democracy and freedom, and those belong to the domain of humanities. The department is a part of the University Grants Commission's Special Assistance Programme.
The QS subject rankings for English are 90 per cent based on the academic survey, so research and publications had nothing to do with it. My suspicion is that graduates of Delhi University have fanned out across the world for postgraduate studies and have signed up for the QS survey giving an American, British or whatever university as their affiliation. When they fill out the form, they probably put the Ivy League and Oxbridge down first and then Delhi University after about 20 or 30 other names. The QS methodology does not take account of the order of the responses, so Harvard would get the same weighting as Delhi, or much less if the respondent gave an American affiliation, making Harvard a domestic vote and Delhi an international one.
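If that suspicion is right, the arithmetic works roughly as follows. This is a sketch of the mechanism described above, not QS's actual code; the equal-weight-per-mention rule and the domestic/international split are taken from the post, and all names and numbers are illustrative.

```python
# Each respondent lists several universities; order is ignored, so the
# thirtieth name counts the same as the first. A vote is "domestic" when the
# respondent's own country matches the university's, "international" otherwise.
from collections import Counter

country_of = {"Harvard": "US", "Oxford": "UK", "Delhi": "IN"}

# (respondent's affiliation country, ordered list of universities named)
responses = [
    ("US", ["Harvard", "Oxford", "Delhi"]),  # Delhi named last, still one vote
    ("US", ["Harvard", "Delhi", "Oxford"]),
    ("UK", ["Oxford", "Harvard", "Delhi"]),
]

votes = Counter()
for respondent_country, named in responses:
    for uni in named:
        kind = "domestic" if country_of[uni] == respondent_country else "international"
        votes[(uni, kind)] += 1

print(votes)
# Delhi's votes are all international; Harvard's votes from US respondents are
# domestic, which, as the post suggests, may carry less weight.
```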
Merging Universities
There is talk of merging Trinity College Dublin and University College Dublin. I am not sure whether there would be any benefit for faculty or students but inevitably there is a ranking angle.
The report says a UCD-TCD merger would give the merged college the critical mass and expertise needed to secure a place among the world’s best-ranked universities. At present, Ireland is not represented among the top 100 universities in the prestigious Times Higher Education World Reputation Ranking.
Monday, September 24, 2012
Are All Disciplines Equal?
Rankers, evaluators and assessors of all sorts have to face the problem that academic disciplines do not go about writing, publishing and citing in exactly the same way. Researchers in some disciplines write shorter papers that have more authors, are more easily divided into smaller units and get more citations sooner after publication than others. Medical research papers tend to be short, frequent, co-authored and cited very soon after publication compared to history or philosophy.
Different ranking organisations follow different approaches when dealing with this diversity of practice. Scimago just counts the total number of publications. ARWU of Shanghai counts publications but gives a double weighting to social sciences and none to the humanities. Thomson Reuters, who power the Times Higher Education world rankings, normalize by field and by country.
Here are some fairly simple things that rankers might try to overcome disparities between disciplines. They could count total pages rather than the number of papers. They could look into counting citations of conference papers or books. Something worth doing might be giving a reduced weighting to co-authored papers, which would shift the balance a little bit towards the arts and humanities and might also help to discourage dubious practices like supervisors adding their names to papers written by their graduate students.
We should also ask whether there are limits to how far field and country normalization should go. Do we really believe that someone who has received an average number of citations for political science in Belarus deserves the same score as someone with an average number of citations for chemical engineering in Germany?
It does seem that there are substantial and significant variations in the cognitive skills required to compete effectively in the academic arena. Here are the combined verbal and quantitative GRE scores by intended major for selected disciplines, 2011-2012.
Physics and Astronomy 317
Economics 313
Philosophy 310
Biology 308
English Language and Literature 305
Education: Secondary 305
History 304
Education: Higher 304
Psychology 300
Sociology 300
Education: Administration 299
Education: Curriculum 299
Education: Early Childhood 294
The scores look as if they are very close together, but this is largely a consequence, perhaps intended, of the revision (dumbing down?) of the GRE.
Is it possible that one reason why physicists and economists publish more papers, which are read more than those by education specialists, is simply that the former have more ability and interest than the latter?
Friday, September 21, 2012
Dumbing Down Watch
There is a "cheating" scandal at Harvard. Apparently, students were given an open-book, open-anything take-home exam for 'Introduction to Congress' and were not expected to consult each other in any way.
Harvard College’s disciplinary board is investigating nearly half of the 279 students who enrolled in Government 1310: “Introduction to Congress” last spring for allegedly plagiarizing answers or inappropriately collaborating on the class’ final take-home exam.
Dean of Undergraduate Education Jay M. Harris said the magnitude of the case was “unprecedented in anyone’s living memory.”
Harris declined to name the course, but several students familiar with the investigation confirmed that Professor Matthew B. Platt's spring government lecture course was the class in question.
The professor of the course brought the case to the Administrative Board in May after noticing similarities in 10 to 20 exams, Harris said. During the summer, the Ad Board conducted a review of all final exams submitted for the course and found about 125 of them to be suspicious.
Presumably this is not the only take-home exam at Harvard, and presumably not the first for this course. So why has half the class now felt compelled to plagiarise or to collaborate inappropriately?
Has the course become more difficult than it used to be? Or are the students less capable? Or have admission standards become less rigorous?
Maybe QS and Times Higher were on to something after all when they dethroned Harvard from the number one spot in their world rankings.
Wednesday, September 19, 2012
New Canadian Research Rankings
Higher Education Strategy Associates recently published their Canadian Research Rankings, which are based on the award of grants and H-indexes. I am not sure about counting grants since it is likely that the skills needed to lobby for grants and those needed to actually do research are not always the same.
The rankings do not include medical research.
The top five for science and engineering are:
1. University of British Columbia
2. Montreal
3. Toronto -- St. George
4. Ottawa
5. McGill
The top five for social sciences and humanities are:
1. University of British Columbia
2. McGill
3. Toronto -- St. George
4. Alberta
5. Guelph
These rankings, like the Times Higher Education (THE) World University Rankings, are based on field normalisation. In other words, they do not simply count the number of grants and H-index scores but compare them with the average for the field. The rationale for this is that there are enormous differences between disciplines, so that it would, for example, be unfair to compare a physicist who has won a grant of 10,000 dollars, which is below average for physics, with an education researcher who has won a similar award, which is above average for education. Equally, does it make sense to rank a physicist with an average H-index for physics well above a linguist with an average one for linguistics?
Here are the average grants for various fields:
biological engineering 84,327
physics 42,913
linguistics 13,147
history 6,417
education 5,733
Here are the average H-indexes for discipline clusters:
science 10.6
social sciences 5.2
humanities 2.3
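As a concrete illustration of what field normalisation does with numbers like these, here is a minimal sketch (the function is illustrative, not HESA's actual method): dividing by the field average turns the physics-versus-education example above into directly comparable scores.

```python
# Field normalisation: express a grant as a multiple of the field average,
# so 1.0 means "average for the discipline". Averages from the list above.

FIELD_AVERAGE_GRANT = {
    "biological engineering": 84_327,
    "physics": 42_913,
    "linguistics": 13_147,
    "history": 6_417,
    "education": 5_733,
}

def normalised(field, grant):
    return round(grant / FIELD_AVERAGE_GRANT[field], 2)

# The same 10,000-dollar grant from the example above:
print(normalised("physics", 10_000))    # 0.23 -- below average for physics
print(normalised("education", 10_000))  # 1.74 -- well above average for education
```

On this scale an average historian and an average physicist both score 1.0, which is exactly the equivalence the next paragraphs question.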
HESA (and THE) do have a point. But there are problems. One is that as we drill down to smaller units of analysis there is a greater risk of outliers. So a single large grant or a single much-cited author in a field with few grants or citations could have a disproportionate impact.
The other is that field normalisation implies that all disciplines are equal. But is that in fact the case?
Thursday, September 13, 2012
A Bit More about Les Ebdon
We have noted that Les Ebdon of the UK government's Office for Fair Access has been lecturing leading universities about the need to admit more students from state schools. If not, they might lose their world-class status and could also be fined and forced to charge much lower fees.
English universities will be expected to enrol thousands more undergraduates from working-class families and poor-performing state schools in return for the right to charge up to £9,000 in tuition fees, it emerged.
Prof Ebdon, newly-appointed director of the Office for Fair Access, suggested that one poor student should eventually be admitted for each candidate enlisted from the wealthiest 20 per cent of households.
Currently, the ratio stands at around one-to-seven, he said. Speaking as he took up his role this week, Prof Ebdon said the country’s best universities were “not going to stay world class in a very competitive world unless they have access to the full pool of talent”.
So if Oxford admits 42% of UK applicants from independent schools and they all come from wealthy households and if some of the state school applicants also come from wealthy households so that altogether about half are from privileged backgrounds, then it would presumably have to stop admitting anyone from the middle 60%.
If Ebdon gets his way then it is quite easy to see what will happen. Independent schools will simply arrange for their pupils to do a year at a carefully selected state school after A levels or the schools will open campuses in Ireland (or post-referendum Scotland) so that they can go into the international category.
Ebdon was the Vice-Chancellor of the University of Bedfordshire, formerly the University of Luton, which some think is among the worst universities in England. This may be a little unfair: it is almost certainly a better university than Luton Town is a football team.
I wonder whether the next move will be for Ebdon to be appointed sports fairness tsar so that he can start telling Manchester United and Liverpool to recruit more players from posh postcodes or with more than one GCSE. Otherwise they will, unlike Luton Town, lose their world-class status.
Wednesday, September 12, 2012
US News Rankings
The latest US News rankings are out. The top six are:
1. Harvard
2. Princeton
3. Yale
4. Columbia
5. Chicago
6. MIT
Disappointing
Here is a comment from MIT News. There is nothing about whether MIT did in fact recruit a load of new international faculty between 2011 and 2012.
For the first time, MIT has been ranked as the world’s top university in the QS World University Rankings. The No. 1 ranking moves the Institute up two spots from its third-place ranking last year; two years ago, MIT was ranked fifth.
The full 2012-13 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found at http://www.topuniversities.com/. QS rankings are based on research quality, graduate employment, teaching quality, and an assessment of the global diversity of faculty and students.
MIT was also ranked the world’s top university in 11 of 28 disciplines ranked by QS, including all five in the “engineering and technology” category: computer science, chemical engineering, civil engineering, electrical engineering and mechanical engineering.
QS also ranked the Institute as the world’s best university in chemistry, economics and econometrics, linguistics, materials science, mathematics, and physics and astronomy.
The Institute ranked among the top five institutions worldwide in another five QS disciplines: accounting and finance (2), biological sciences (2), statistics and operational research (3), environmental sciences (4) and communication and media studies (5).
Rounding out the top five universities in the QS ranking were the University of Cambridge, Harvard University, University College London and the University of Oxford.
What happened to MIT and Cambridge?
QS now has a new world number one university. Massachusetts Institute of Technology (MIT) has replaced Cambridge and overtaken Harvard.
Unfortunately, this change probably means very little.
Overall the change was very slight. MIT rose from 99.21 to 100 while Cambridge fell from 100 to 99.8.
There was no change in the two surveys that account for half of the weighting. MIT and Cambridge both scored 100 for the academic and the employer surveys in 2011 and in 2012.
On the citations per faculty indicator Cambridge did quite a bit better this year, rising from 92.7 to 97 while MIT fell slightly from 99.6 to 99.3. This could mean that, compared to front-runner Caltech, Cambridge has produced more articles, had its articles cited more often, reduced its faculty numbers, or some combination of the three.
For faculty student ratio, Cambridge fell slightly while MIT's score remained the same. For international students both fell slightly.
What made the difference was the international faculty indicator. Cambridge's score went from 98.4 to 98.2 while MIT's rose from 50 to 86.4, which means 1.82 more points in the total ranking, more than enough to overcome Cambridge's improvement in citations and pull slightly ahead.
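The arithmetic behind that 1.82 points, assuming the international faculty indicator carries its usual 5 per cent weight in the QS overall score:

```python
# International faculty is assumed to carry 5% of the QS overall score.
weight = 0.05
mit_gain = 86.4 - 50.0              # jump on the international faculty indicator
print(round(weight * mit_gain, 2))  # 1.82 overall points
```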
Having done some rapid switching between the ranking scores and university statistics, I would estimate that a score of 50 represents about 15% international faculty and a score of 86 about 30%.
It is most unlikely that MIT has in one year recruited about 150 international faculty while getting rid of a similar number of American faculty. We would surely have heard about it. After all, even the allocation of office space at MIT makes national headlines. Even more so if they had boosted the total number of faculty.
International faculty is a notoriously difficult statistic for data collectors. "International" could mean anything from getting a degree abroad to being a temporary visiting scholar. QS are quite clear that they mean current national status but this may not always reach the branch campuses, institutes, departments and programs where data is born before starting the long painful journey to the world rankings.
I suspect that what happened in the case of MIT is that somebody somewhere told somebody somewhere that permanent residents should be counted as international or that faculty who forgot to fill out a form were moved into the international category or something like that.
All this draws attention to what may have been a major mistake by QS, that is configuring the surveys so that a large number of universities are squashed together at the top. For the academic survey, there are 11 universities with a score of 100 and another 17 with a score of 99 to 99.9. Consequently, differentiating between universities at the top depends largely on data about students and faculty submitted by institutions themselves. Even if they are totally scrupulous about finding and disseminating data there are all sorts of things that can cause problems at each stage of the process.
I have not heard any official reaction yet from MIT. I believe that there are some people there who are quite good at counting things so maybe there will be a comment or an explanation soon.
Tuesday, September 11, 2012
Reactions to the QS Rankings
NUS, NTU climb in global ranking of universities (Singapore)
UQ in world’s top 50 in QS rankings system (Australia)
UK universities take four of six top global rankings (UK)
Irish universities still struggling in world rankings (Ireland)
UM in the top 200 now (Malaysia)
McGill holds Top-20 place in QS university rankings (Canada)
Monday, September 10, 2012
University of Adelaide Falls out of the Top 100
News about the latest QS WUR is beginning to trickle out. This is from the Australian.
The rest of the world is catching up with Australian universities, dragging down the sector's relative performance in the latest QS global rankings.
The league table, released this morning, shows Australia has one fewer institution in the QS top 100 for 2012-13, with the University of Adelaide sliding 10 places to 102.
Sunday, September 09, 2012
Will There be a New Number One?
One reason why QS and Times Higher Education get more publicity than the Shanghai ARWU, HEEACT and other rankings is that they periodically produce interesting surprises. Last year Caltech replaced Harvard as number one in the THE rankings and Tokyo overtook Hong Kong as the best Asian university. Two years ago Cambridge pushed Harvard aside at the top of the QS rankings.
Will there be another change this year?
There is an intriguing interview with Les Ebdon, the UK government's "university access tsar", in the Daily Telegraph. Ebdon claims that leading British universities are in danger of losing their world class status unless they start admitting more students from state schools who may be somewhat less academically qualified. Perhaps he knows something.
So if Cambridge slips and is replaced by Harvard, MIT or Yale as QS number one (if it is Oxford or Imperial, QS will lose all credibility) we can expect comments that Cambridge should start listening to him before it's too late.
I suspect that if there is a new number one it might have something to do with the QS employer review. Since this is a sign-up survey and since the numbers are quite small, it would not take many additional responses to push Harvard or MIT into first place.
With regard to THE, the problem there is that normalising everything by country, year and/or field is a potential source of instability. If there is a vigorous debate with lots of citations about an obscure article by a Harvard researcher in a little cited field it could dramatically boost the score on the citations indicator.
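A toy example of the instability, with made-up numbers: field normalisation divides a paper's citations by the expected count for its field, so in a barely cited field one hotly debated article yields a huge multiple of the average.

```python
# Field-normalised impact: citations divided by the expected (average) count
# for the paper's field. All numbers are invented for illustration.

def normalised_impact(cites, expected_for_field):
    return cites / expected_for_field

print(normalised_impact(10, 20.0))  # busy medical field: 0.5x the field average
print(normalised_impact(40, 0.5))   # obscure field, hot debate: 80x the average
```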
Getting a good score in the THE rankings also depends on what a school is being compared to. Last year, Hong Kong universities slumped because they were taken out of China (with low average scores) and classified as a separate country (with high average scores), so that their relative scores were lower. If they are put back in China they will go up this year and there will be a new number one Asian university.
So anybody want to bet on Harvard making a come back this year? Or Hong Kong regaining the top Asian spot from Tokyo in the THE rankings?
Friday, September 07, 2012
Rating Rankings
A caustic comment from the University of Adelaide:
"University rankings would have to be the worst consumer ratings in the retail market. In no other area are customers so badly served by consumer ratings as in the global student market," said Professor Bebbington. "The international rankings must change, or student consumers worldwide will eventually stop using them.
"Next to buying a house, choosing a university education is for most students, the largest financial commitment they will ever make. A degree costs more than a car, but if consumer advice for cars was this poor, there would be uproar.
"Students the world over use rankings for advice on which particular teaching program, at what campus to enrol in. Most don't realise that many of the rankings scarcely measure teaching or the campus experience at all. They mostly measure research outcomes." For this reason, said Professor Bebbington, half the universities in the US, including some very outstanding institutions, remain unranked.
He went on to discuss the inconsistency of university ranking results against the quality of the learning experience. According to Professor Bebbington, such a contradiction should come as no surprise: "Anyone who knows exactly what the rankings actually measure knows they have little bearing on the quality of the education."
Another problem was an increasing number of ranking systems, each producing different results. "With some, we have seen universities shift 40 places in a year, simply because the methodology has changed, when the universities have barely changed at all," he said. "It leaves students and parents hopelessly confused."
The Jiao Tong rankings in particular favour natural sciences and scarcely reflect other fields according to Professor Bebbington. "Moreover, they assume all universities have research missions. Those dedicated to serving their State or region through teaching, rather than competing in the international research arena, may be outstanding places to study, but at present are unranked.
"What is needed is a ranking that offers different lists for undergraduate teaching programs, international research-focussed programs, and regionally focussed programs. We need a ranking that measures all disciplines and is not so focussed on hard science."
The University of Adelaide has been steadily improving in the Shanghai ARWU rankings, although not in the two indicators that measure current research of the highest quality, publications in Nature and Science and highly cited researchers. It also improved quite a bit in the QS rankings from 107th in 2006 to 92nd in 2011. One wonders what Professor Bebbington is complaining about. Has he seen next week's results?
Still, he has a point. The ARWU rankings have a very indirect measure of teaching, alumni who have won Nobel and Fields awards, while QS uses a very blunt instrument, faculty-student ratio. It is possible to do well on this indicator by recruiting large numbers of research staff who never do any teaching at all.
Times Higher Education rankings director Phil Baty has a reply in the Australian.
As yet unpublished research by international student recruitment agency IDP will show university rankings remain one of the very top information sources for international students when choosing where to study.
If they are making such a major investment in their future, the overall reputation of the institution is paramount. The name on the degree certificate is a global passport to a lifelong career.
Broad composite rankings - even those that say nothing about teaching and learning - will always matter to students.
But that does not mean the rankers do not have to improve.
The Times Higher Education league table, due for publication on October 4, is the only ranking to take teaching seriously. We employ five indicators (worth 30 per cent) dedicated to offering real insight into the teaching environment, based on things such as a university's resources, staff-student ratio and undergraduate-postgraduate mix.
Times Higher Education has made genuine efforts to capture some factors that may have relevance to teaching. Their academic survey has a question about teaching, although it is only about postgraduate teaching. The others are necessarily somewhat indirect: income per academic, doctorates awarded, and the ratio of doctoral students to undergraduates.
If THE are to improve their learning environment criterion, one option might be a properly organised, vetted and verified survey of undergraduate students, perhaps based on university email records.
Saturday, September 01, 2012
From Preschool to PISA
We still don't have a ranking of kindergartens although no doubt that will happen one day. But there is now a country ranking of early education prepared by the Economist Intelligence Unit for the Lien Foundation "a Singapore philanthropic house noted for its model of radical philanthropy" (Economist print edition, 30/06/12). Details can be found here.
The ranking combines four indicators, Availability, Affordability, Quality of the Preschool Environment and Social Context, "which examines how healthy and ready for school children are".
The top five are:
1. Finland
2. Sweden
3. Norway
4. UK
5. Belgium
The bottom five are:
41. Vietnam
42. China
43. Philippines
44. Indonesia
45. India
It is interesting to compare this with the ability of fifteen year old students as measured by the 2009 PISA report. Finland, which is top of the preschool ranking, also scores well on all three sections of the PISA rankings, Reading, Science and Maths.
However, the United Kingdom, which is ranked 4th in the preschool rankings, does no better than the OECD average in the PISA rankings for Reading and Maths, although it does not do too badly in Science.
At the bottom of the preschool rankings we find China and India. In the PISA plus ranking India was represented by just two states and they performed miserably.
On the other hand, China whose preschool system is ranked 42nd out of 45, does very well, ranking first in all three sections, or rather Shanghai does very well.
A warning is needed here. Mainland China is represented in the PISA rankings only by Shanghai. The results would almost certainly be very different if they included the whole of China including its huge rural hinterland. It is likely that if all of China had been assessed its scores would have been something like Taiwan (Chinese Taipei), 495 instead of 556 for Reading, 543 instead of 600 for Maths and 520 instead of 575 for Science.
Even so, it is striking that the UK could have such highly ranked preschools and such modest secondary school achievement while China has lowly ranked preschools but such high scores at secondary level if we just consider Shanghai or scores similar to Taiwan's if we estimate what the nationwide level might be.
There are other apparent anomalies. For example, Singapore is only slightly better than Mexico on the preschool rankings but forges well ahead on the PISA rankings.
It could be that preschool education does not have much to do with long-term achievement. Also, some of the criteria, such as how healthy children are, may not have much to do with anything that a preschool does or can do. Nor should we forget that the preschool ranking deals with input while the PISA rankings are about performance.
Furthermore, it is likely that the culture, social structure and demography of contemporary Asia, Europe and the Americas explain some of these differences in the effect of preschool education.
Are International Students an Indicator of Quality?
From BBC News
Some 2,600 foreign students affected by the London Metropolitan University (LMU) visa ban have been given until at least 1 December to find a new course.
The UK Border Agency says it will write to students after 1 October and "will ensure you have 60 days" to make a new student application or leave the UK.
On Thursday, the UKBA revoked LMU's licence to authorise non-EU visas. Ministers said it was failing to monitor student attendance.
Apparently a substantial number of students did not have valid visas or had not been properly tested for spoken English and in many cases it was not possible to even tell if they were attending class.
From LMU's home page
At London Metropolitan University we believe that everyone has the right to an affordable quality education. Our fees for 2012/13 have been set at levels significantly lower than other Universities, and our courses recently received top marks from the UK's Quality Assurance Agency. We are committed to delivering affordable quality education, and are proud of the diversity & achievements of our students, alumni and staff.
Here at London Met we put our students at the centre of all we do.
London Met is a great place to study, located in the heart of one of the world's most exciting cities.
We stand out because we offer courses of quality, in a vibrant, socially diverse environment, which will help launch your career.
We are committed to transforming lives, meeting needs and building careers.
Notice the bit about everyone.
Never trust a university that talks about transforming lives.