Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, September 20, 2016
The long wait for the THE rankings is nearly over ...
but we can still have some fun reading the latest post at ROARS by Giuseppe De Nicolao.
Monday, September 19, 2016
Update on previous post
The reputation data used by THE in the 2016 world rankings, for which the world is breathlessly waiting, is the same data used in their reputation rankings released last May, collected between January and March.
Therefore, the distribution of responses from disciplinary groups this year was 9% for the arts and humanities, 15% for the social sciences, and 13% for business (28% for the last two combined). In 2015 it was 16% for the arts and humanities and 19% for the social sciences (which then included business).
Since UK universities are relatively strong in the humanities and Asian universities relatively strong in business studies, the result was a shift in the reputation rankings away from the UK and towards Asian universities. Oxford fell from 3rd (score 80.4) to 5th (score 69.1), and Bristol and Durham dropped out of the top 100, while Tsinghua University rose from 26th place to 18th, Peking University from 32nd to 21st and Seoul National University from the 51-60 band to 45th.
In the forthcoming world rankings, British universities (although threatened by Brexit) ought to do better because of the inclusion of books in the publications and citations indicators, and certain Asian universities, though by no means all, may do better because their citations for mega-projects will be partially restored.
Notice that THE have also said that this year they will combine the reputation scores for 2015 and 2016, something that is unprecedented. Presumably this will reduce the fall of UK universities in the reputation survey. Combined with the inclusion of books in the database, this may mean that UK universities may not fall this year and may even go up a bit (ATBB).
Thursday, September 15, 2016
Some predictions for the THE rankings and summit
Here are my predictions for the THE rankings on the 21st and the academic summit on the 26th-28th.
- Donald Trump will not be invited to give a keynote address.
- The decline of US public universities will be blamed on government spending cuts.
- British universities will be found to be in mortal danger from Brexit and visa controls.
- Phil Baty will give a rankings "masterclass" but will have to apologise to feminists because he couldn't think of anything else to call it.
- The words 'prestige' and 'prestigious' will be used more times than in the novel by Christopher Priest or the film by Christopher Nolan.
- The counting of books will help British universities, especially Oxford and Cambridge, but they will still be threatened by Brexit.
- The partial reinclusion of citations of papers with 1,000+ authors, mainly in physics, will lead to a modest recovery of some universities in France, Korea, Japan and Turkey. The rise of Asia will resume.
- Since the host city or university of THE summits somehow manages to get in the top ten, Berkeley will recover from last year's fall to 13th place.
- Last year the percentage of survey responses from the arts and humanities fell to 9% from 16%. I suspect that this year the fall might be reversed and that the reason THE are combining the reputation survey results for this year and 2015 is to reduce the swing back to UK universities, which are suffering because of visa controls and Brexit.
- At least one of the above will be wrong.
Sunday, September 11, 2016
Waiting for the THE world rankings
The world, having recovered from the shocks of the Shanghai, QS and RUR rankings, now waits for the THE world rankings, especially the research impact indicator measured by field normalised citations.
It might be helpful to show the top 5 universities for this criterion since 2010-11.
2010-11
1. Caltech
2. MIT
3. Princeton
4. Alexandria University
5. UC Santa Cruz
2011-12
1. Princeton
2. MIT
3. Caltech
4. UC Santa Barbara
5. Rice University
2012-13
1. Rice University
2. National Research Nuclear University MEPhI
3. MIT
4. UC Santa Cruz
5. Princeton
2013-14
1. MIT
2. Tokyo Metropolitan University
3. Rice University
4. UC Santa Cruz
5. Caltech
2014-15
1. MIT
2. UC Santa Cruz
3. Tokyo Metropolitan University
4. Rice University
5. Caltech
2015-16
1. St George's, University of London
2. Stanford University
3. UC Santa Cruz
4. Caltech
5. Harvard
Notice that no university has been in the top five for citations in every year.
Last year THE introduced some changes to this indicator, one of which was to exclude papers with more than 1000 authors from the citation count. This, along with a dilution of the regional modification that gave a bonus to universities in low scoring countries, had a devastating effect on some universities in France, Korea, Japan, Morocco, Chile and Turkey.
The citations indicator has always been an embarrassment to THE, throwing up a number of improbable front runners aka previously undiscovered pockets of excellence. Last year they introduced some reforms but not enough. It would be a good idea for THE to get rid of the regional modification altogether, to introduce full scale fractional counting, to reduce the weighting assigned to citations, to exclude self-citations and secondary affiliations and to include more than one measure of research impact and research quality.
Excluding the papers, mainly in particle physics, with 1,000-plus "authors" meant avoiding the bizarre situation where a contributor to a single paper with 2,000 authors and 2,000 citations would get the same credit as the author of a thousand papers each of which had been cited twice.
But this measure also meant that some of the most significant scientific activity of the century would not be counted in the rankings. The best solution would have been fractional counting, distributing the citations among all of the institutions or contributors, and in fact THE did this for their pilot African rankings at the University of Johannesburg.
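A toy calculation, just a sketch of the two counting schemes rather than anything THE actually runs, makes the contrast concrete:

# Full counting: every contributor's institution receives all of a paper's
# citations. Fractional counting: the citations are shared among the authors.

def full_credit(citations: int) -> float:
    """Credit per contributing institution under full counting."""
    return float(citations)

def fractional_credit(citations: int, n_authors: int) -> float:
    """Credit per author under fractional counting."""
    return citations / n_authors

# One contributor to a single 2,000-author paper with 2,000 citations:
print(full_credit(2000))              # 2000.0
print(fractional_credit(2000, 2000))  # 1.0

# Under full counting that matches the credit of an author who wrote a
# thousand papers, each cited twice:
print(1000 * full_credit(2))          # 2000.0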
Now, THE have announced a change for this year's rankings. According to their data chief, Duncan Ross:
" Last year we excluded a small number of papers with more than 1,000 authors. I won’t rehearse the arguments for their exclusion here, but we said at the time that we would try to identify a way to re-include them that would prevent the distorting effect that they had on the overall metric for a few universities.
This year they are included – although they will be treated differently from other papers. Every university with researchers who author a kilo-author paper will receive at least 5 per cent credit for the paper – rising proportionally to the number of authors that the university has.
This is the first time that we have used a proportional measure in our citations score, and we will be monitoring it with interest.
We’re also pleased that this year the calculation of the Times Higher Education World University Rankings has been subject to independent audit by professional services firm PricewaterhouseCoopers (PwC). "
This could have perverse consequences. If an institution has one contributor to a 1,000-author paper with 2,000 citations then that author will bring the full 2,000 citations to the university. But if there are 1,001 authors then he or she would bring only 100 citations, the 5 per cent floor.
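The quoted rule implies a cliff at the thousand-author mark. A minimal sketch, assuming "kilo-author" means more than 1,000 authors and that a university's credit is the larger of 5 per cent and its share of the author list:

def the_2016_credit(citations: int, n_authors: int, uni_authors: int = 1) -> float:
    # Citation credit for one university, as the quoted rule seems to imply.
    if n_authors <= 1000:
        return float(citations)  # ordinary paper: full counting, as before
    share = max(0.05, uni_authors / n_authors)  # 5% floor, rising proportionally
    return citations * share

print(the_2016_credit(2000, 1000))  # 2000.0 -- full credit
print(the_2016_credit(2000, 1001))  # 100.0  -- one extra author, 95% less credit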
It is possible that we will see a cluster of papers with 998, 999, 1000 authors as institutions remove their researchers from the author lists or project leaders start capping the number of contributors.
This could be a way of finding out if research intensive universities really do care about the THE rankings.
Similarly, QS now excludes papers with more than ten contributing institutions. If researchers are concerned about the QS rankings they will ensure that the number of institutions does not go above ten. Let's see if we start getting large numbers of papers with ten institutions but none or few with 11, 12, 13 and so on.
I am wondering why THE would bother introducing this relatively small change. Wouldn't it make more sense to introduce a lot of small changes all at once and get the resulting volatility over and done with?
I wonder if this has something to do with the THE world academic summit being held at Berkeley on 26-28 September in cooperation with UC Berkeley. Last year Berkeley fell from 8th to 13th in the THE world rankings. Since it is a contributor to several multi-contributor papers it is possible that the partial re-inclusion of hyper-papers will help the university back into the top ten.
Wednesday, September 07, 2016
More on Brexitophobic hysteria
John Field, an expert on lifelong learning, comments on the growing Brexit hysteria blowing through academia.
Professor Field quotes the Vice Chancellor of the University of York:
"York, along with many other British universities, appears to have fallen in the QS league table because of concerns about the impact of Brexit; specifically, this has been attributed to worries about future access to research funding and whether we will be able to recruit excellent academic staff and students from all over the world."
The shadow of Brexit falls across the land
The western chattering and scribbling classes sometimes like to reflect on their superiority to the pre-scientific attitudes of the local peasantry: astrology, nationalism, religion and things like that. But it seems that the credentialled elite of Britain are now in the grip of a great fear of an all-pervading spirit called Brexit, whose malign power is unlimited in time and space.
Thus the Independent tells us that university rankings (QS in this case) show that "post Brexit uncertainty and long-term funding issues" have hit UK higher education.
The Guardian implies that Brexit has something to do with the decline of British universities in the rankings without actually saying so.
"British universities have taken a tumble in the latest international rankings, as concern persists about the potential impact of Brexit on the country’s higher education sector. "
Many British universities have fallen in the QS rankings this year but the idea that Brexit has anything to do with it is nonsense. The Brexit vote was on June 23rd, well after QS's deadlines for submitting respondents for the reputation surveys and updating institutional data. The citations indicator refers to the period 2011-2015.
The belief that rankings reveal the dire effects of funding cuts and immigration restrictions is somewhat more plausible but fundamentally untenable.
Certainly, British universities have taken some blows in the QS rankings this year. Of the 18 universities in the top 100 in 2015 two are in the same place this year, two have risen and 14 have fallen. This is associated with a general decline in performance in the academic reputation indicator which accounts for 40% of the overall score.
Of those 18 universities three, Oxford, Cambridge and Edinburgh, hold the same rank in the academic reputation indicator, one, King's College London, has risen and fourteen are down.
The idea that the reputation of British universities is suffering because survey respondents have heard that the UK government is cutting spending or tightening up on visa regulations is based on some unlikely assumptions about how researchers go about completing reputation surveys.
Do researchers really base their assessment of research quality on media headlines, often inaccurate and alarmist? Or do they make an honest assessment of performance over the last few years or even decades? Or do they vote according to their self interest, nominating their almae matres or former employers?
I suspect that the decline of British universities in the QS reputation indicator has little to do with perceptions about British universities and a lot more to do with growing sophistication about and interest in rankings in the rest of the world, particularly in East Asia and maybe parts of continental Europe.
What was that about the origins of science in seventeenth century England?
Trigger warning
If you're triggered by just about anything, don't read this.
Those who dislike inherited privilege will be entertained by this account of the last days of Charles II. It is from a post by Gregory Cochran at the blog West Hunter.
It seems that there has been a little bit of progress over the centuries. The future Charles III has a thing about homeopathy, expensive pseudoscientific rubbish, but at least it is harmless.
I can't help wondering whether the malign spirit of pseudoscience has now taken refuge in university faculties of social science with their endless crises of irreproducible research.
"Back in the good old days, Charles II, age 53, had a fit one Sunday evening, while fondling two of his mistresses.
Monday they bled him (cupping and scarifying) of eight ounces of blood. Followed by an antimony emetic, vitriol in peony water, purgative pills, and a clyster. Followed by another clyster after two hours. Then syrup of blackthorn, more antimony, and rock salt. Next, more laxatives, white hellebore root up the nostrils. Powdered cowslip flowers. More purgatives. Then Spanish Fly. They shaved his head and stuck blistering plasters all over it, plastered the soles of his feet with tar and pigeon-dung, then said good-night.
Tuesday. Ten more ounces of blood, a gargle of elm in syrup of mallow, and a julep of black cherry, peony, crushed pearls, and white sugar candy.
Wednesday. Things looked good: only senna pods infused in spring water, along with white wine and nutmeg.
Thursday. More fits. They gave him a spirituous draft made from the skull of a man who had died a violent death. Peruvian bark, repeatedly, interspersed with more human skull. Didn’t work.
Friday. The king was worse. He tells them not to let poor Nelly starve. They try the Oriental Bezoar Stone, and more bleeding. Dies at noon."
Saturday, September 03, 2016
Another Important Ranking
Ranking fans have a busy week ahead of them. On Tuesday the QS world rankings will be announced and results will probably start leaking on Sunday or Monday. Then there will be the Shanghai broad subject rankings.
Times Higher Education have promised a major revelation on Monday. I suspect that this might just be the top ten or twenty of the world rankings or a preview of their new US college rankings.
But this ranking might be more important. HackerRank, "a platform that ranks engineers based on their coding skills and helps companies discover talent faster", has just published a ranking of countries according to the speed and accuracy with which developers can solve a variety of coding challenges.
China is first and Russia second.
The USA is 28th and the UK 29th. Eastern Europe and East Asia generally perform well.
For once, there is some fairly good news for Africa and the Muslim world: Turkey is 30th, Egypt 42nd, Bangladesh 44th and Nigeria 48th.
The top ten are
1. China
2. Russia
3. Poland
4. Switzerland
5. Hungary
6. Japan
7. Taiwan
8. France
9. Czech Republic
10. Italy
Tuesday, August 30, 2016
The Pursuit of Excellence
If you are wondering how they did it, see the story in the Hindustan Times.
"The Institute of Excellence and Higher Education (IEHE) in Bhopal improved its teacher and student ratio from 1:47 to 1:24 a day before the National Assessment and Accreditation Council (NAAC) team was scheduled to visit to retain the institute’s Grade ‘A’.
A three-member NAAC team, led by former vice chancellor SK Singh, will reach on Monday and inspect the institute in 24 sessions.
IEHE, which was facing hardships due to shortage of teachers, appointed 54 guest faculties in a week. The strength of teachers increased from 58 to 112."
There is nothing very unusual about this sort of thing. There have been, for example, suspicions about some British universities offering "relatively short-term contracts" that expire just after the Research Excellence Framework (REF) assessment is completed.
Tuesday, August 23, 2016
The Naming of Universities
You can tell a few things about universities from their names. If, for example, a university has a compass direction in its name then it is usually not ranked very highly: University of East London, Southern Illinois University. Those named after long dead people -- Yale, Duke -- are often, but not always (Bishop Grosseteste), very prestigious.
It might be an idea to have a ranking of institutions with the most interesting or strangest names. After all nearly everything else about higher education is ranked somewhere or other. California University of Pennsylvania should be near the top. And of course there is Hamburger University and Butler University. Or, from a few years ago, The Universal Institute of Food Management and Hotel Administration I was Called by the Almighty and I Said Yes My Lord, which was actually a restaurant somewhere along the road from Maiduguri to Kano in Nigeria.
Another high flier might be the Lovely Professional University, a "semi-residential university college in North India", which is ranked 4326th in the world and 213th in India by Webometrics. I doubt if it will get many votes in the QS or THE academic surveys unless it changes its name, which I suspect might be a literal translation from Hindi or Sanskrit.
Sunday, August 21, 2016
Worth Watching
Video
Salvatore Babones, Gaming the Rankings Game: University Rankings and the Future of the University
Thursday, August 18, 2016
Shanghai Rankings 2016 Update
An interesting tweet from qwertzuiop reports that the average change in rank position in this year's Shanghai rankings was 32 compared to 11.7 between 2014 and 2015. Changes in methodology, even simple ones, can lead to a lot of churning.
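For anyone who wants to check such figures, here is a minimal sketch of the calculation in Python. The file and column names are hypothetical; all it assumes is a table of universities with their ranks in both years.

import pandas as pd

# Hypothetical inputs: one row per university with its ARWU rank in each year.
r2015 = pd.read_csv("arwu_2015.csv")  # columns: university, rank
r2016 = pd.read_csv("arwu_2016.csv")

merged = r2015.merge(r2016, on="university", suffixes=("_2015", "_2016"))

# Average absolute change in rank position among universities ranked in both years.
print((merged["rank_2016"] - merged["rank_2015"]).abs().mean())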
Meanwhile, here are the correlations between the various indicators in the ranking. In general, the indicators are not measuring exactly the same thing, but neither do they raise red flags by showing a low or zero association with each other.
The lowest correlations are between publications and the alumni and award indicators (alumni and faculty winning Nobel prizes and Fields medals). Publications are papers in the Science Citation Index and the Social Science Citation Index in 2015, while the alumni and award indicators go back several decades. Time makes a difference, and as a measure of contemporary research excellence Nobel prizes and Fields medals may be losing their relevance.
[Correlation table lost in extraction: it showed the pairwise correlations between the ARWU indicators (Alumni, Award, HiCi, N&S, PUB and PCP).]
All correlations are significant at the 0.01 level (2-tailed).
N is 500 in all cases except for Nature and Science, where it is 497.
Wednesday, August 17, 2016
The Shanghai Rankings: More Interesting This Year
The Shanghai rankings are usually the most stable and therefore the least interesting (for journalists, politicians and bureaucrats) of the current array.
This year, however, they are quite volatile. The reason is that the Shanghai Ranking Consultancy has completed the transition from the old to the new lists of highly cited researchers supplied by Thomson Reuters. In 2014 and 2015 they used both lists with an equal weighting, which reduced the abruptness of the transition. In addition, the rankings now count only primary affiliations. As a result there have been enough ascents and descents to gladden the hearts of higher education journalists and experts.
It should be noted that the effect of this is largely to accelerate trends that were in progress anyway. The old list was clearly out of date and it was time for a new one.
First, to my predictions. Harvard is still number one. Wisconsin at Madison, Rutgers and Virginia Polytechnic Institute have all fallen. Aalborg, Nanyang Technological University, Peking, Chiba and Tsinghua have risen. Peking and Tsinghua are now in the top 100 and heading for the top 50.
But the University of Tehran has not risen. It has fallen by 84 places, presumably because it lost a highly cited researcher during the second half of 2015.
Overall, the rankings provide more evidence for the rise of China, with two universities in the top 100 and 54 in the top 500, compared with none and 44 last year, but not for the rest of Asia. South Korea has gone from 12 universities in the top 500 to 11, Japan from 18 to 16 and Israel from 6 to 5. India still has only one representative in the top 500 and Malaysia two.
Meanwhile the USA now has 137 universities in the top 500 compared with 146 last year.
Rapidly rising institutions include the Toulouse School of Economics, from 375th to 265th, largely because of being given a free pass this year for papers in Nature and Science, the University of the Witwatersrand from 244th to 204th, the University of Queensland from 77th to 55th, and King Abdullah University of Science and Technology from 352nd to 254th.
Kwazulu-Natal has fallen from 413th to 494th, Dartmouth College from 215th to 271st and Universiti Malaya from 353rd to 413th.
Saturday, August 13, 2016
My Predictions for the Shanghai Rankings
Watch this Nate Silver.
In the latest edition of the Shanghai Academic Ranking of World Universities (ARWU), to be announced on Monday, first place will go to Harvard.
The methodology for this prediction is based on an extremely complex and sophisticated algorithm that incorporates a large number of variables and will remain a secret for the moment.
Now for some easier predictions.
The Shanghai rankings are generally famous for their stability and consistency which makes them rather boring for journalists and naive administrators. No shocking headlines about catastrophic plunges in the rankings after the latest vandalism by government Scrooges.
But the Shanghai rankers have had problems with their highly cited researchers indicator. Thomson Reuters have stopped adding to their old list of highly cited researchers and have published a new one. The Shanghai Ranking Consultancy combined the two lists in 2014 and 2015 and have said that this year only the new list will be used in calculating the overall score.
Last year Shanghai published a list of scores for the highly cited indicator in 2013 (the old list), combined scores in 2015 and scores if the new list alone had been used.
Here are the universities that will rise or fall by ten points (two points in the weighted overall rankings) as the new list replaces the combined lists. This assumes that universities have not recruited or lost highly cited researchers during 2015; if they have, then the predictions will be incorrect. Also, changes in the highly cited indicator may be balanced by changes in other indicators.
Predicted to Fall
University of Wisconsin, Madison
Rutgers: State University of New Jersey
Virginia Polytechnic Institute
Predicted to Rise
Aalborg University
Nanyang Technological University
Peking University
Chiba University
Tsinghua University
University of Tehran
Thursday, August 11, 2016
Value Added Ranking
There has been a lot of talk about ranking universities by factors other than the usual mix of contributions to research and innovation, reputation surveys and inputs such as spending, teaching resources or student quality.
The emerging idea is that universities should be assessed according to their ability to teach students or to inculcate desirable skills or attributes.
Much of this is powered by the growing awareness that American and European secondary schools are failing to produce sufficient numbers of students with the ability to undertake and complete anything that could realistically be called a university education. It is unlikely that this is the fault of the schools. The unavoidable verdict of recent research is that the problem has very little to do with institutional racism, a lack of grit or resilience or whatever the X factor of the moment is, or a failure to adopt Finnish, Chinese or Singaporean teaching methods. It is simply that students entering the school system are on average less intelligent than they used to be, and those leaving are consequently also less intelligent.
There is now a market for rankings that will measure the quality of universities not by their resources, wealth or research output but by their ability to add value to students and to prepare them for employment or to enable them to complete their courses.
This could, however, lead to massively perverse consequences. If universities are assessed according to the percentage of entrants who graduate within a certain period, or according to their graduates' employability, then there could be a temptation to dilute graduation requirements.
Nevertheless, the idea of adding value is clearly becoming more popular. It can be seen in the attempt to introduce a national rating system in the US and in the UK's proposal to use the Teaching Excellence Framework (TEF) to rank universities.
One UK ranking that includes a value-added measure is the Guardian University Guide. This includes eight indicators, three of which measure student satisfaction. Other indicators are staff-student ratio and spending per student. There is also a measure of student outcomes, that is graduate-level employment or entry into a postgraduate course after six months; one of the quality of students, measured by A-level qualifications; and one a measure of value added, that is the difference between students' entry-level exam results and their eventual degree results.
It is therefore possible to get a rough idea of what factors might actually produce positive student outcomes.
The overall ranking for 2015-16 starts by being quite conventional with the top three places going to Cambridge, Oxford and St Andrews. Some might be surprised by Exeter in 9th place and Loughborough in 11th, ahead of LSE and UCL.
Measuring student quality by exam scores produces unsurprising results at the top. Cambridge is first followed by Oxford and Imperial. For staff student ratio the top three are UCL, Oxford and SOAS and for spending per student Oxford, Cambridge and the University of the Arts London.
For student satisfaction with courses, Bath, Keele and UEA are in the lead while Oxford is 5th and Cambridge 12th. It's when we look at the Value Added that we find some really unusual results. The top three are Gloucester, Edinburgh and Abertay.
After plugging the indicator scores into an SPSS file we can calculate the correlations between the desired outcome, that is graduate-level employment or postgraduate study, and a variety of possible associated factors (a code sketch of the same calculation follows the list below).
Here in descending order are the correlations with career prospects:
average entry tariff .820
student staff ratio .647
spending per student .569
satisfaction with course .559
satisfaction with teaching .531
value added .335
satisfaction with feedback -.171.
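Here is the promised sketch, a pandas version of the SPSS exercise. The column names are hypothetical stand-ins for the Guardian's actual spreadsheet headers.

import pandas as pd

df = pd.read_csv("guardian_2016.csv")  # hypothetical export of the Guardian table

factors = [
    "entry_tariff",            # average entry tariff
    "student_staff_ratio",
    "spend_per_student",
    "course_satisfaction",
    "teaching_satisfaction",
    "value_added",
    "feedback_satisfaction",
]

# Pearson correlation of each factor with career prospects, highest first.
print(df[factors].corrwith(df["career_prospects"]).sort_values(ascending=False))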
It would seem that if you want to know which university is best for career prospects then the most important piece of data is the average academic ability of the students. The student staff ratio and money spent are also significant as is satisfaction with courses and teaching.
The correlation between value added and career prospects is much less and rather modest.
The universities were divided into thirds according to average entry tariff. In the top third of universities there was a strong correlation between career prospects and average entry level tariff, .628, and a modest one with spending, .355. Nothing else was associated with career success.
In the middle third the factor most associated with career prospects was course satisfaction, .498, followed by average entry tariff, .449, staff student ratio, .436, and satisfaction with teaching, .362. Satisfaction with feedback and value added were insignificant.
However, for the least selective third of universities the picture was rather different. The factor most strongly associated with career success was satisfaction with feedback, .493, followed by value added, .479, course satisfaction, .470, satisfaction with teaching, .439, and average entry tariff, .401. The relationship with spending and staff-student ratio was insignificant.
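Continuing the sketch above, the split into thirds by entry tariff can be reproduced with a quantile cut:

# Divide universities into thirds by average entry tariff, then repeat the
# correlations with career prospects within each group.
df["tier"] = pd.qcut(df["entry_tariff"], 3, labels=["least selective", "middle", "top"])

for tier, group in df.groupby("tier", observed=True):
    print(tier)
    print(group[factors].corrwith(group["career_prospects"]).round(3))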
The evidence of the Guardian rankings is that value added would only be of interest to students at or applying to the least selective third of UK universities. For the rest it is of no importance. It is debatable whether it is worth making it the centre of a new set of rankings.
Worth Reading 7
Richard Holmes
ABSTRACT
This paper analyses the global university rankings introduced by Times Higher Education (THE) in partnership with Thomson Reuters in 2010 after the magazine ended its association with its former data provider Quacquarelli Symonds. The distinctive features of the new rankings included a new procedure for determining the choice and weighting of the various indicators, new criteria for inclusion in and exclusion from the rankings, a revised academic reputation survey, the introduction of an indicator that attempted to measure innovation, the addition of a third measure of internationalization, the use of several indicators related to teaching, the bundling of indicators into groups, and, most significantly, the employment of a very distinctive measure of research impact with an unprecedentedly large weighting. The rankings met with little enthusiasm in 2010 but by 2014 were regarded with some favour by administrators and policy makers despite the reservations and criticism of informed observers and the unusual scores produced by the citations indicator. In 2014, THE announced that the partnership would come to an end and that the magazine would collect its own data. There were some changes in 2015 but the basic structure established in 2010 and 2011 remained intact.
Forthcoming in Asian Journal of University Education, December 2015. Prepublication copy can be accessed here.
Wednesday, August 03, 2016
Ghost Writers
An article by Chris Havergal in Times Higher Education reports on research by Lisa Lines, a lecturer at the University of New South Wales, published in Teaching in Higher Education (behind a paywall), which suggests that the output of ghost-written student essays is probably greater than expected.
The researcher had ordered undergraduate and master's essays in history and then had them marked by "leading academics". Of the 13 undergraduate essays only two received a failing grade, while six of the 13 master's essays failed and seven passed.
Lines says that the quality of the purchased essays was surprisingly high.
Possibly. Or you could say that the standard of marking was surprisingly low. Note that this was at a university that is in the top 150 in the world according to ARWU.
Havergal quotes Lines as saying:
“It is clear that this type of cheating is virtually undetectable by academics when students take precautions against being caught,” she concludes.
“This fact, coupled with the study’s findings that the quality of essays available for purchase is sufficient to receive a passing grade or better, reveals a very troubling situation for universities and poses a real threat to academic integrity and standards, and public perceptions of these.”
The problem lies not with dishonest students or crooked essay writers but with corrupt selection practices that admit academically incompetent students and a dysfunctional employment system. If you have students who cannot write and intelligent graduates who cannot find work then ghost writing is inevitable.
Monday, July 25, 2016
Looks Like THE isn't Big in Japan Anymore
It has taken a long time but it seems that Japanese universities are getting a little irritated about the Times Higher Education (THE) World and Asian University Rankings.
I have commented on the THE Asian rankings here, here and here.
According to the Nikkei Asian Review,
Research University 11, a consortium of Japan's top 11 universities, issued a statement earlier this month that the Times Higher Education ranking should not be used to determine national policy or as an achievement indicator.
Another umbrella group, the Research University Network of Japan, which includes universities and research institutions, has opposed the ranking every year Japanese universities have taken big tumbles.
and
So achieving a higher ranking does not necessarily correlate with providing better educations and research opportunities.
For some universities, there is another worry -- politics. The Japanese government in 2013 said it would aim to ensure that Japanese universities rank among the world's top 100 over the following decade. Now, Japanese universities are required to develop specific strategies to help the government reach this "revitalization" goal.