The Australian newspaper The Age has a piece by Erica Cervini on how she allowed her dog to complete the QS academic reputation survey on the quality of veterinary schools.
She doesn't elaborate on how the dog chose the schools. Was it by barking or tail wagging when shown pictures of the buildings?
Seriously though, she does have a point. Can QS stop people signing up just to support their employer or outvote their rivals?
To be fair, QS are aware that their surveys might be manipulated and have taken steps over the years to prevent this, such as forbidding respondents from voting for their declared employer and blocking repeat voting from the same computer. Even so, it seems that some universities, especially in Latin America, are getting scores in the reputation surveys that appear too high, especially when compared with their overall scores. In the employer survey the Pontifical Catholic University of Chile is 56th (overall 167) and the University of Buenos Aires 49th (overall 198). In the academic survey the University of Sao Paulo is 54th (overall 132) and the National Autonomous University of Mexico 55th (overall 175).
QS are apparently considering reforming their citations per faculty indicator and allowing unchanged responses to the surveys to be recycled for five instead of three years. This is welcome but a more rigorous overhaul of the reputation indicators is sorely needed.
Tuesday, July 28, 2015
Thursday, July 23, 2015
Perfect Storm Heading for Tokyo Metropolitan University
Seen on the Times Higher Education website today:
Tokyo Metropolitan University
World's Best University
Scored a Perfect 100.00 for Two Years in Citations Sector
From TMU to the World
Tokyo Metropolitan University got its perfect score largely because it was one of hundreds of institutions contributing to a few publications from the Large Hadron Collider project. In their recent experimental African rankings THE started using fractionalized counting of citations. If THE use this method in the coming world rankings then TMU will surely suffer a dramatic fall in the citations indicator.
I would not like to be the president of TMU on September 30th.
Wednesday, July 22, 2015
Recommended Reading
Anybody interested in educational policy, especially the never-ending campaign to close gaps of one sort or another, or in the oddities of university rankings, should take a look at chapter four of Jordan Ellenberg's How Not to Be Wrong: The Power of Mathematical Thinking, which is about the obvious -- or what ought to be obvious -- observation that smaller populations are more variable.
He notes that South Dakota is top of the league for brain cancer while North Dakota is near the bottom. What makes the difference? It is just that the bigger the population the more likely it is that outliers will be diluted by a great mass of mediocrity. So, extreme scores tend to crop up in small places or small samples.
Similarly, when he tossed coins ten at a time he came up with head counts ranging from 3 to 9 out of ten.
When he tossed them 100 at a time he got counts ranging from 45 to 60.
When he (actually his computer program) tossed them 1,000 times, the counts ranged from 462 to 537.
It is worth remembering this when a study with a double-digit sample is published showing the latest way to close one of the achievement gaps, or a very small school in a rural state somewhere starts boosting the test scores of underperforming students, or a few test takers reveal that the national IQ is imploding. Or when the studies fail to be replicated, if indeed anyone tries.
Or when university rankings show very small or very unproductive institutions having an enormous research impact as measured by citations.
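Anyone who wants to see this for themselves can run a few lines of code. Here is a minimal sketch in Python (the batch sizes and the number of batches are arbitrary choices of mine, not Ellenberg's) that repeats the coin-tossing exercise and reports the most extreme proportions of heads seen at each batch size.

```python
import random

def extreme_proportions(batch_size, batches=1000):
    """Toss a fair coin 'batch_size' times per batch and return the lowest
    and highest proportion of heads observed across all batches."""
    proportions = []
    for _ in range(batches):
        heads = sum(random.random() < 0.5 for _ in range(batch_size))
        proportions.append(heads / batch_size)
    return min(proportions), max(proportions)

for size in (10, 100, 1000):
    low, high = extreme_proportions(size)
    print(f"batch size {size:4d}: share of heads ranged from {low:.2f} to {high:.2f}")
```

The spread collapses as the batches get bigger, which is exactly why the most extreme rates, whether for brain cancer or for citations, keep turning up in the smallest units.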
Saturday, July 18, 2015
In the QS BRICS rankings nearly everybody gets a prize
There is a growing trend towards specialised and regional university rankings. The magic of this is that they can provide something for almost everybody. QS recently published its latest BRICS rankings which combined data from five very different university systems. The result was a triumphant (almost) success for everybody (almost).
Here are some highlights that QS could use in selling the BRICS rankings or an expanded version.
Russian universities are ahead of everybody else for teaching quality.
The top 21 universities in the BRICS for Faculty Student Ratio (perhaps not a perfect proxy for teaching excellence) are Russian, headed by Bauman Moscow State Technical University. Imagine what Russian universities could do if QS recognised the importance of teaching and increased the weighting for this indicator.
India performs excellently for Faculty with a Ph D.
Out of the top 15 for this category, ten are Indian and all of these get the maximum score of 100. Of the other five, four are Brazilian and one Chinese. If only QS realised the importance of a highly qualified faculty, India would do much better in the overall rankings.
South Africa takes five out of the first six places for international faculty.
China has four out of five top places for academic reputation and employer reputation.
Meanwhile a Brazilian university is first for international faculty and another is third for academic reputation.
It seems that with rankings like these a lot depends on the weighting assigned to the various indicators.
Yes, going to the library might be good for you
A study by Felly Chiteng Kot and Jennifer L. Jones has found that
"using a given library resource was associated with a small, but also meaningful, gain in first-term grade point average, net of other factors."
But, correlation does not necessarily mean causation. Could it be that bright people like to go to libraries?
Still, most students are likely to behave better in the library than other places, so let's not quibble too much.
"using a given library resource was associated with a small, but also meaningful, gain in first-term grade point average, net of other factors."
But, correlation does not necessarily mean causation. Could it be that bright people like to go to libraries?
Still, most students are likely to behave better in the library than other places, so let's not quibble too much.
Tuesday, July 14, 2015
Implications of the THE African Pilot Ranking
The most interesting thing about THE's experimental African ranking is the use of fractionalised counting of citations. This means that the total number of citations is divided by the number of institutions contributing to a publication. Previously, the method used in THE rankings was to assign the total citations of a paper to all of the institutions that contributed, just as though each one had been the only contributor. This has produced some very questionable results, with universities that were excellent but very specialised, or just generally unproductive, achieving remarkably high scores for citations. Panjab University, Tokyo Metropolitan University, Scuola Normale Superiore Pisa, Federico Santa Maria Technical University and Moscow State Engineering Physics Institute have all had moments of glory in the THE rankings because of citation scores that were dramatically higher than their scores for research or any other indicator or group of indicators.
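For readers unfamiliar with the term, here is a toy sketch of the difference between full and fractionalised counting. The paper data and institution names below are invented; the point is simply that fractional counting divides a paper's citations among its contributing institutions instead of giving each one full credit.

```python
from collections import defaultdict

# Invented example: one heavily cited multi-institution physics paper and
# one modestly cited single-institution paper.
papers = [
    (2000, [f"LHC partner {i}" for i in range(100)]),  # 100 contributing institutions
    (40, ["Small Specialised University"]),
]

def citation_totals(papers, fractional):
    """Tally citations per institution, either in full or divided by the
    number of contributing institutions (fractionalised counting)."""
    totals = defaultdict(float)
    for citations, institutions in papers:
        credit = citations / len(institutions) if fractional else citations
        for institution in institutions:
            totals[institution] += credit
    return totals

print(citation_totals(papers, fractional=False)["LHC partner 0"])  # 2000.0
print(citation_totals(papers, fractional=True)["LHC partner 0"])   # 20.0
```

Under full counting a tiny institution that signs on to one such paper looks as influential as the biggest contributor; under fractional counting its share shrinks to a hundredth of that.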
The new method, if applied generally, is likely to see a significant reduction in the scores given to such universities. We can estimate what might happen by looking at the four universities that are included in both the African pilot ranking and last year's world rankings: Cape Town, Witwatersrand, Stellenbosch and Université Cadi Ayyad in Marrakesh, Morocco.
In the world rankings these universities received scores of 86.6, 67.3, 45.6 and 83 respectively. The score of 83 for Universite Cadi Ayyad resulted very largely from its contributions to several publications from the Large Hadron Collider project, one of which has been cited over 2,000 times, a low overall output of papers and the "regional modification" that gave a big boost to low scoring countries. The scores for the three South African universities reflected a larger total output and citations over a broad range of disciplines.
In the African pilot ranking the scores for citations were Cape Town 90.90, Witwatersrand 99.76, Stellenbosch 95.48 and Cadi Ayyad 78.61. The high scores for the South African institutions reflect a much lower mean score than in the world rankings.
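THE's published methodology standardises indicator data against the cohort being ranked (z-scores passed through a cumulative probability function), so the same underlying citation impact earns a higher score when the surrounding cohort is weaker. The sketch below, with entirely invented impact figures and cohorts, shows the effect.

```python
import math

def scaled_scores(values):
    """Standardise raw impact values against their own cohort and map them
    to a 0-100 scale via the normal cumulative distribution function."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [100 * 0.5 * (1 + math.erf((v - mean) / (sd * math.sqrt(2)))) for v in values]

impact = 1.8  # invented field-normalised impact for one university
world_cohort = [impact, 2.5, 2.2, 2.0, 1.5, 1.0, 0.8]     # invented strong cohort
regional_cohort = [impact, 1.2, 0.9, 0.7, 0.6, 0.5, 0.4]  # invented weaker cohort

print("score against the world cohort:    %.1f" % scaled_scores(world_cohort)[0])
print("score against the regional cohort: %.1f" % scaled_scores(regional_cohort)[0])
```

The raw impact is identical in both runs; only the company it keeps changes.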
The fall in Cadi Ayyad's citation score from 3.6 points below Cape Town to 21.3 below and its falling behind Stellenbosch and Witwatersrand presumably reflect the impact of fractionalised counting.
This suggests that if fractionalised counting is used in the coming World University Rankings many small or specialised institutions will suffer and there will be a lot of reshuffling.
Thursday, July 09, 2015
The Top University in Africa is ...
... the University of Cape Town. What a surprise!
Times Higher Education (THE) has produced another "snapshot" ranking. This one is a list of 15 African universities ranked according to "research influence", that is the number of citations per paper normalised by field and year. It seems that a larger list will be published at a THE summit at the University of Johannesburg scheduled for the end of this month. Then, apparently, there will be discussions about full rankings with a broad range of indicators.
This is a smart move. Apart from diluting the impact of the QS BRICS rankings, this table puts the summit host in the top ten and gets attention from around the continent with three places in the north, two in the west and two in the east in the top fifteen.
Here is the top 15:
1. University of Cape Town, South Africa
2. University of the Witwatersrand, South Africa
3. Makerere University, Uganda
4. Stellenbosch University, South Africa
5. University of Kwazulu-Natal, South Africa
6. University of Port Harcourt, Nigeria
7. University of the Western Cape, South Africa
8. University of Nairobi, Kenya
9. University of Johannesburg, South Africa
10. Universite Cadi Ayyad, Morocco
11. University of Pretoria, South Africa
12. University of Ghana
13. University of South Africa
14. Suez Canal University, Egypt
15. Universite Hassan II, Morocco.
This is, of course, just one indicator but even so there will be a few academic eyebrows rising around the continent. Makerere has a good national and regional reputation but does it have more research influence than all but two South African universities?
How come Suez Canal University is there but not Cairo University or the American University in Cairo? And I am sure that in Nigeria there will be a lot of smirking around Ahmadu Bello and Ibadan Universities about Port Harcourt in sixth place.
One very good thing about this "experimental and preliminary ranking" is that THE and data provider Scopus are now using fractionalised counting of citations, so that if 100 universities contribute to a publication they each get credit for one hundredth of the citations.
That has not stopped Makerere and Port Harcourt from getting a boost, perhaps too much of a boost, for taking part in a huge multinational medical study but it has reduced the distortions that this indicator can cause.
So, for once, well done THE!... Now, what about taking a look at secondary affiliations?
Saturday, July 04, 2015
Is this really happening?
If this continues France will be the least intelligent country in the world in a century.
Drawing straight lines on graphs and getting excited about tiny samples can be dangerous. Even so, this is a little frightening.
James Thompson's blog notes a study by Edward Dutton and Richard Lynn that suggests that the French national average IQ declined by nearly 4 points in a decade. The sample size was only 79 so we should not start panicking too much until there are a few more studies. It will be interesting to see if this one is replicated.
Thursday, July 02, 2015
Now the British Academy Sees a Problem
Yesterday I referred to the poor numeracy skills of British (England and Northern Ireland) tertiary graduates reported by the OECD.
Now the British Academy has had its say. It reports that the performance of British school pupils is mediocre and that many undergraduate students are weak in statistics.
But it looks like middling (compared to the OECD average) secondary school students become almost rock bottom (in the OECD) tertiary graduates. Could it be that British universities are actually subtracting relative value from their students?
The Academy notes:
"Our school pupils tend to be ranked only in the middle of developed nations in mathematics. Our undergraduates embark on degree courses with varying, and often weak, fluency in statistics. And, in the workplace, demand for more advanced quantitative skills has risen sharply in the past two decades."

Perhaps this has something to do with relatively high graduation rates at British universities, so that mediocre students with weak numeracy skills are recorded as tertiary graduates while their counterparts in most of the OECD drop out and remain classified as secondary graduates. Even if that were the case, the mediocrity of secondary students and tertiary graduates would still need to be addressed.
The Academy proposes a strategy that includes improving the quality of quantitative skills teaching, reviewing school curricula and addressing the early dropping of maths by secondary school students.
I suspect that that will be insufficient.
Wednesday, July 01, 2015
Today India, Tomorrow Japan, Then ....
The ranking businesses are extending their global tentacles. Times Higher Education has produced a "snapshot" MENA ranking that yielded interesting results -- Texas A&M University Qatar top for research impact -- and will be announcing their world rankings from Melbourne.
Meanwhile, QS will be in India next week to reveal their latest BRICS rankings and has been getting attention in new places for its subject rankings that get to places other rankings won't go.
With QS getting more international, it is no surprise to hear that Mitsui & Co, Ltd, has purchased shares in QS:
'Nunzio Quacquarelli, CEO of QS, said the investment from Mitsui “can especially support our development in Asia”, adding, “we were seeking and have found a likeminded company which shares our long term vision”.'

This is not the first sign of Mitsui's interest in tertiary education:
'Last year the company also invested $5m in Synergis Education, an education company specialising in online and on the ground adult learner programmes.
“We aim to use our experience in the online education field to create new services,” said Takeshi Akutsu, GM of Mitsui Service Business Division in a statement.
“At the same time, through this business we will help to nurture the global human resources needed by the global economy.” '
I wonder if QS will try and start ranking online courses next.
Tuesday, June 30, 2015
What's the Problem with U-Multirank and AHELO?
In a recent post, I discussed the contrast between the poor skills of young people in the UK (strictly speaking Northern Ireland and England) and the high regard in which British universities are held by the brand name rankers.
There is a piece of data in the skills report from the OECD that is interesting in this respect. Figure 2.2 shows the average numeracy skills of new graduates (age 16-29, 2012). It is depressing reading. The data for tertiary graduates shows that only Italy does worse than the UK, and Ireland is either the same or almost the same. The US is very slightly ahead. The top scorers are Austria, Flanders and the Czech Republic.
Something that should have everybody running around doing research and forming committees is that British tertiary graduates are only very slightly better than most European secondary graduates and slightly better than South Koreans with less than an upper secondary qualification.
It is possible, indeed quite probable, that British tertiary graduates do better on verbal skills and likely that they could conduct themselves well in interviews. Perhaps also, it is places like the University of East London and Bolton University that are dragging down the British average. But this dramatically poor performance is such a glaring contrast to the preening self satisfaction of the higher education establishment that some discussion at least is called for.
We may be seeing an explanation for the reluctance of the Russell Group and its orbiters and the Ivy League to cooperate with U-Multirank and their disdain for the AHELO project that is in marked contrast with their support for the trusted and prestigious THE rankings. They are quite happy to be assessed on reputation, resources, income and citations but comparison with the cognitive skills of graduates from the upstarts of East Asia and perhaps Eastern and Central Europe is something to be avoided.
Saturday, June 27, 2015
Why Russia Might Rise Fairly Quickly in the Rankings After Falling a Bit
An article by Alex Usher in Higher Education in Russia and Beyond, reprinted in University World News, suggests five structural reasons why Russian universities will not rise very quickly in the global rankings. These are:
- the concentration of resources in academies rather than universities
- excessive specialisation among existing universities
- a shortage of researchers caused by the economic crisis of the nineties
- excessive bureaucratic control over research projects
- limited fluency in English.
Over the next couple of years things might even get a bit worse. QS are considering introducing a sensible form of field normalisation, just for the five main subject groups. This might not happen since they are well aware of the further advantages this will give to English speaking universities, especially Oxbridge and places like Yale and Princeton, that are strong in the humanities and social sciences. But if it did it would not be good for Russian universities. Meanwhile, THE has spoken about doing something about hugely cited multi-authored physics papers and that could drastically affect institutions like MEPhI.
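For readers who have not met the term, field normalisation divides a paper's citations by the world average for papers in the same field and year, so that low-citation disciplines such as mathematics are not penalised relative to high-citation ones such as medicine. A minimal sketch with invented benchmark figures:

```python
# Invented world-average citations per paper by field and publication year.
world_baseline = {
    ("clinical medicine", 2013): 12.0,
    ("mathematics", 2013): 2.5,
}

def normalised_impact(papers):
    """Mean of (citations / world average for the paper's field and year).
    A value of 1.0 means exactly world-average impact."""
    ratios = [cites / world_baseline[(field, year)] for cites, field, year in papers]
    return sum(ratios) / len(ratios)

maths_heavy = [(5, "mathematics", 2013), (4, "mathematics", 2013)]
medicine_heavy = [(14, "clinical medicine", 2013), (10, "clinical medicine", 2013)]

print("maths-heavy portfolio:    %.2f" % normalised_impact(maths_heavy))     # 1.80
print("medicine-heavy portfolio: %.2f" % normalised_impact(medicine_heavy))  # 1.00
```

The medicine-heavy portfolio has more raw citations but only world-average impact once the benchmarks are applied.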
But after that, there are special features in the QS and THE world rankings that could be exploited by Russian universities.
Russia is surrounded by former Soviet countries where Russian is widely used and which could provide large numbers of international research collaborators, an indicator in the THE rankings, and could be a source of international students and faculty, indicators in the THE and QS rankings and a source of respondents to the THE and QS academic surveys.
Russia might also consider tapping the Chinese supply of bright students for STEM subjects. It is likely that the red bourgeoisie will start wondering about the wisdom of sending their heirs to universities that give academic credit for things like walking around with a mattress or not shaving armpit hair and think about a degree in engineering from Moscow State or MEPhI.
Russian universities also appear to have a strong bias towards applied sciences and vocational training that should, if marketed properly, produce high scores in the QS employer survey and the THE Industry Income: Innovation indicator.
Friday, June 26, 2015
Italy and France Accept Gaokao Scores
What will happen when universities find gaokao is a better predictor of academic ability than A levels or SAT?
This is from YIBADA.
"Up to 1,000 universities in France, Italy, and other 14 popular overseas destinations for Chinese applicants are now accepting national college entrance test scores or "gaokao" scores as admission criteria, according to a report published on Monday by MyOffer, a London-based online student placement portal.
The findings reflect the growing international recognition for China's national college entrance tests despite lagging behind other exams.
MyOffer, which helps international students with university placements, overseas internships and career development, released the study as this year's "gaokao" scores were announced in several parts of China.
Earlier reports claimed that "gaokao" test results were accepted in 20 countries and regions, but MyOffer's study has by far the most detailed findings available."
Sunday, June 21, 2015
Today, Cuba and China, Tomorrow North Korea?
Another sign of the growing desperation of American colleges to find international students to take the courses American students just won't take is the four Cuban students who will take the TOEFL in Havana a week from now. There are plans for the GRE to be offered in Cuba in October.
Saturday, June 20, 2015
The Implications of the University of San Francisco Accepting Gaokao scores.
The University of San Francisco has announced that it will admit a limited number of students on the basis of their scores on the Gaokao, the rigorous Chinese national university entrance exam, plus an interview and English language test in Beijing. The candidates will be spared the necessity of taking TOEFL prep courses and flying to Hong Kong or Singapore for the SAT test.
American and British universities are running out of students capable of taking tertiary education courses. Average cognitive skills of local students are stagnant or declining, which explains the obsession of universities with finding students from overseas to bring in revenue and balance the books. China appears to have a large number of students capable of high achievement in numeracy-based fields.
What would happen if American universities found that Gaokao scores were more predictive of academic success than a dumbed down SAT? What if the English language component turned out to be just as good a measure of language proficiency as IELTS or TOEFL? The consequence might be that the Gaokao could become the normal route for admission to universities outside China.
And looking ahead several decades, what would happen if the Gaokao was offered in languages other than Chinese with test centres being set up outside China?
Thursday, June 18, 2015
Which is the Real Fraud?
The Australian, via Inside Higher Ed, has an article by Kylar Loussikian about a shadowy organisation apparently based in Colchester, England, that supplies ghostwritten academic essays. Australian universities, and maybe others, are getting very concerned about the racket.
'The most common issue, ghostwritten essays, represents a “wicked problem,” said John Shields, deputy dean of the University of Sydney’s business school. “It’s deep and embedded and it’s hard to catch and kill,” he said. “In one sense, ghostwriting has emerged as an area of key concern in academic honesty because many universities are using a first-line defense in terms of [text matching software], and the simple plagiarism approach being detectable has forced those who, for whatever reason, choose to engage in dishonest conduct, to go one level deeper.” '
No doubt there will be a lot of finger pointing and tongue wagging. But are companies like these the real frauds? When millions of students are unable to do the work in courses for which they have been selected, shouldn't we conclude that the entire admission process is flawed?
Why are people capable of turning out essays and papers at a few hours' or days' notice not employed in universities? Doesn't this suggest that there is a problem with the recruitment process?
Meanwhile the ghostwriting virus seems to be spreading to graduate and faculty research. In the last few weeks I have received messages from Gulf Dissertation Online, which has "expertly helped and consulted PhD Professors, Lecturers and Scholars with their Thesis, Dissertations and Research Papers for over 12 Years", and Publish Pedia, which "is now offering a unique opportunity to Scholars and Professors who are pursuing their first publication ISI indexed journal or due to insufficient time not able to follow up on their new papers for publication to high impact factor top tier journals keeping the mandatory guidelines for ISI journal approved by the University".
Tuesday, June 16, 2015
The British Paradox Again
We have been told many times before that British universities are punching above their weight and are outperforming their international counterparts. Year after year they do extremely well in the QS and THE world rankings although perhaps not as well in the Shanghai ARWU.
This excellent performance is in glaring contrast to the well documented decline in the cognitive skills of young people in the United Kingdom. A recent publication from the OECD on youth, skills and employability shows that the proportion of 16-29 year olds in the UK (actually England and Northern Ireland) with low literacy skills was well above the OECD average and slightly above the United States. Only Spain and Italy did worse. Not unexpectedly, the top performers here were Japan, Korea and Finland.
What is even more frightening is that the UK is very distinctive in that the proportion of 16-29 year olds with poor literacy skills is no lower than that of 30-54 year olds. In every other country except Japan, where literacy is very high among both groups, and Norway, literacy has risen among the younger generation.
For numeracy skills of 16-29 year olds, the UK is again well below the OECD average. The share of young people with limited numeracy is higher than in any other country except Italy and the US. Again there is a decline relative to the 30-54 year olds.
The OECD has also published data on problem solving abilities in technology-rich environments. This time the UK, like every country assessed, has improved over time but is still behind everyone else except the US, Ireland and Poland.
So how can British students be so bad at literacy, numeracy and problem solving when the universities are, according to international rankers, so brilliant?
Some suggestions.
Perhaps, the rankings are biased towards British universities.
Perhaps, British higher education is highly differentiated with a few outstanding institutions that get high scores in the global league tables and a mass of others that cannot even squeeze into the 400s or 500s or do not even try.
Perhaps, it is just a question of time and in the next few years British universities will collapse under the weight of thousands of students with low cognitive skills who must be admitted to keep revenues flowing.
Monday, June 08, 2015
Why is Bogazici University considered so great in Turkey although it actually is at 400th position in the QS world rankings?
Another question from Quora.
The answer is that the QS rankings favour universities with an established reputation in those countries that are interested in rankings, those that have extensive international linkages, those with a lot of faculty and those with strengths in medical research.
In contrast, the Times Higher Education (THE) rankings favour those powered by hadron driven citations and with the good fortune to be located in countries where most universities produce few citations.
What will happen if THE does reform its citations indicator?
Is The QS Computer Science Ranking Accurate?
Ben Zhao, Professor at UC Santa Barbara, doesn't think so.
"There's a bunch of rankings, US News, Shanghai, US National Research Council, QS. Of all of these, I would probably say that QS is one of the least useful. Why do I say that? I get SPAMMED on multiple email addresses to respond to a survey on QS university rankings. I don't respond, and they just send more mail. This is NOT the behavior of a reputable organization trying to gather a legitimate view of universities and their research quality. ... "
I wouldn't disagree with him about the QS subject rankings, which outside the ranks of the world elite are based on very small samples of employers and academics and small numbers of citations. But it might be unfair to complain about being spammed all the time. This is probably happening because many universities are submitting his name to QS for the academic opinion survey.
As Oscar Wilde would probably have said, the only thing worse than being spammed is not being spammed.
Wednesday, June 03, 2015
What do Indian Scientists do on Their Holidays?
The Indian Express has an interesting interview with the Vice-Chancellor of Panjab University, which Times Higher Education (THE), but nobody else, considers to be the best or second best university in India, a feat achieved by an outstanding score for citations.
Here is an extract:
"Did the four-year period, 2010-2014, counted for the Times ranking include old research papers as well?Yes. It is not about papers that came out in this period but also the papers in which PU figures and which have a high citation. It is a mix of so many things. God particle came up in 2012. So, all those papers are being cited multiple times. Every theorist is cited. So, PU was already doing well, and discovery of God particle made it even better. When there was a lull and Fermilab was closed down for a while, and they were re-building CERN, PU and TIFR went on and joined the groups in B-factory in Japan.
The thing is that you have a job in the university, you have a job for life, you can decide to sleep, still you will get the salary. These professors at PU, or those at IIT-Guwahati, TIFR people, they are conscious that their productivity should not suffer. They should continuously be valued as a member of these collaborations. So, they keep working. So, when there is a holiday, when [other] people spend time here and there,what do High energy physicists do? Class khatam hoti hai. The next day they take a flight, and go to CERN or Chicago, and there they work hard. You are actually trying to make up for the time you could not do anything because you were doing teaching. That is how international faculty values them also, and they are continuously being included."
So, the Vice-Chancellor is aware that it is the CERN project that is the cause of PU's ranking success. It will be interesting to see what happens if THE does bite the unpleasant-tasting bullet and introduce fractionated counting of citations.
But if PU and other Indian institutions continue to improve, even if there is a (temporary?) dip in the THE rankings, then the key to that success may be here. Indian scientists can draw a salary while sleeping if they want but they can also go to Switzerland and discover the fundamental particles of the universe if so inclined. Increasingly, western scientists are apparently expected to spend their days and nights filling out forms, applying for grants, writing teaching philosophies, attending sexual harassment seminars, making safe spaces all over the place, undergoing diversity sensitivity training and so on and so on.
Friday, May 29, 2015
University Ranking Challenge: Your starter for 5,154
Phil Baty, editor of the Times Higher Education World University Rankings, has indicated that the publication of a paper from the ATLAS and CMS experiments at the CERN Large Hadron Collider project is a challenge for rankers.
The paper in question has a total of 5,154 authors, if that is the right word, with sole or primary affiliation to 344 institutions. Of those authors 104 have a secondary affiliation. One is deceased. Under THE's current methodology every institution contributing to the paper will get credit for all the citations that the paper will receive, which is very likely to run into the thousands.
For the elite universities participating in these projects a few thousand citations will make little or no difference. But for a small specialised institution, or a large one that does little research, those citations spread out over a few hundred papers could make a big difference.
In last year's rankings places like Florida Institute of Technology, Universite Cadi Ayyad of Marrakesh, Morocco, Federico Santa Maria Technical University, Chile, and Bogazici University, Turkey, got implausibly high scores for citations that were well ahead of those for the other criteria.
The paper in question does set a record for the number of contributors although the challenge is not particularly new.
At a seminar in Moscow earlier this year, Baty suggested that THE, now independent of Thomson Reuters, was considering using fractionated counting, dividing all the citations among the contributing institutions.
This would be an excellent idea and should be technically quite feasible since CWTS at Leiden University use it as their default option.
But there would be a price to pay. The current methodology allows THE to boast that it has found a way of uncovering hitherto unnoticed pockets of excellence. It is also a selling point in THE's imperial designs of expanding into regions where there has so far been little interest in ranking: Russia, the Middle East, Africa, the BRICS. A few universities in those regions could make a splash in the rankings if they recruited, even as an adjunct, a researcher working on the LHC project.
It would be most welcome if THE does start using fractionated counting in its citations indicator. Also welcome would be some other changes: not counting self-citation, reducing the weighting for the indicator, including several different methods of evaluating research impact or quality, and, especially important, getting rid of the "regional modification" that awards a bonus for being located in a low scoring country.
Friday, May 22, 2015
An Experiment Using LinkedIn Data to Rank Arab Universities
University World News recently published an article by Rahul Choudaha suggesting that LinkedIn is the future of global rankings. At the moment that sounds a bit exaggerated and LinkedIn in its present form may be gone in a decade but he could be on to something.
Leaving Europe, North America and East Asia aside, the reliability of institutional data is very low and that makes serious evaluation of graduate outcomes, staff quality, income, teaching resources and so on extremely difficult.
This problem is especially acute for the Middle East and North Africa region, where there appears to be a big demand for university rankings but little accurate information. The consequence has been some highly implausible results in the rankings attempted so far. Last year THE produced a "snapshot" of a ranking indicator which put Texas A&M Qatar as the top university for research impact, and QS's pilot rankings have the American University of Sharjah in joint first place for academic reputation, Al-Nahrain University top for faculty student ratio and Khalifa University top for papers per faculty.
So, here is a list of Arab universities ordered by the number of students or professionals putting them on the Decision Board, indicating an interest in attending. Counting was done on the 14th of May.
If this approximates to reputation among students and the public then it seems that Egyptian universities have been undervalued in previous ranking exercises.
Rank | University | Country | Interested in attending |
---|---|---|---|
1 | Helwan University | Egypt | 422 |
2 | American University in Cairo | Egypt | 394 |
3 | Arab Academy of Science, Technology and Maritime Transport | Egypt | 359 |
4 | Cairo University | Egypt | 353 |
5 | Ain Shams University | Egypt | 245 |
6 | Alexandria University | Egypt | 230 |
7 | King Fahd University of Petroleum and Minerals | Saudi Arabia | 211 |
8 | American University of Beirut | Lebanon | 193 |
9 | École Nationale Polytechnique d'Alger | Algeria | 184 |
10 | King Saud University | Saudi Arabia | 138 |
11 | Lebanese American University | Lebanon | 133 |
12 | American University in Dubai | UAE | 131 |
13 | Qatar University | Qatar | 102 |
14 | American University of Sharjah | UAE | 91 |
15 | King Abdullah University of Science and Technology | Saudi Arabia | 85 |
16= | Al Azhar University | Egypt | 78 |
16= | University of Dubai | UAE | 78 |
18 | Damascus University | Syria | 73 |
19 | University of Dammam | Saudi Arabia | 70 |
20= | Mansoura University | Egypt | 68 |
20= | Houari Boumediene University of Science and Technology | Algeria | 68 |
22 | UAE University | UAE | 62 |
23 | Higher Colleges of Technology | UAE | 58 |
24= | Tanta University | Egypt | 51 |
24= | German University in Cairo | Egypt | 51 |
26 | Zagazig University | Egypt | 50 |
27= | Suez Canal University | Egypt | 43 |
27= | King Abdulaziz University | Saudi Arabia | 43 |
27= | Umm Al-Qura University | Saudi Arabia | 43 |
30= | Abu Dhabi University | UAE | 33 |
30= | Ajman University of Science & Technology | UAE | 33 |
32 | Assiut University | Egypt | 32 |
33 | Université Mentouri de Constantine | Algeria | 27 |
34 | Université Libanaise | Lebanon | 26 |
35 | Al-Imam Mohamed Ibn Saud Islamic University | Saudi Arabia | 23 |
36 | Université Saad Dahlab Blida | Algeria | 22 |
37 | Prince Sultan University | Saudi Arabia | 21 |
38= | King Faisal University | Saudi Arabia | 20 |
38= | Université Mouloud Mammeri de Tizi Ouzou | Algeria | 20 |
40 | Université Badji Mokhtar de Annaba | Algeria | 19 |
41 | Khalifa University | UAE | 19 |
42= | Université de Batna | Algeria | 18 |
42= | Université Cadi Ayyad Marrakech | Morocco | 18 |
44= | King Khalid University | Saudi Arabia | 17 |
44= | Sanaa University | Yemen | 17 |
46 | University of Bejaia | Algeria | 16 |
47= | Zayed University | UAE | 14 |
47= | Université Sidi Mohammed Ben Abdellah | Morocco | 14 |
49= | Masdar Institute of Science and Technology | UAE | 13 |
49= | Université d'Oran | Algeria | 13 |
51= | Yarmouk University | Jordan | 12 |
51= | Universite de Tunis El Manar | Tunisia | 12 |
53= | Texas A&M Qatar | Qater | 11 |
53= | University of Sharjah | UAE | 11 |
53= | Minia University | Egypt | 11 |
53= | University of Tunis | Tunisia | 11 |
53= | Universite de Monastir | Tunisia | 11 |
58= | University of Jordan | Jordan | 10 |
58= | Benha University | Egypt | 10 |
58= | University of Bahrain | Bahrain | 10 |
61 | Taif University | Saudi Arabia | 0 |
62 | Kuwait University | Kuwait | 0 |
63 | University of Baghdad | Iraq | 0 |
64 | University of Khartoum | Sudan | 0 |
65 | Jordan University of Science and Technology | Jordan | 0 |
66 | Mosul University | Iraq | 0 |
67 | Qassim University | Saudi Arabia | 0 |
68 | Taibah University | Saudi Arabia | 0 |
69 | Hashemite University | Jordan | 0 |
70 | Université Abou Bekr Belkaid Tlemcen | Algeria | 0 |
71 | Al Balqa Applied University | Jordan | 0 |
72 | Babylon University | Iraq | 0 |
73 | South Valley University | Egypt | 0 |
74 | Menoufia University | Egypt | 0 |
75 | Fayoum University | Egypt | 0 |
76 | Sohag University | Egypt | 0 |
77 | Beni-Suef University | Egypt | 0 |
78 | Jazan University | Saudi Arabia | 0 |
79 | Universite de Sfax | Tunisia | 0 |
80 | Al Nahrain University | Iraq | 0 |
81 | University of Basrah | Iraq | 0 |
82 | King Saud bin Abdulaziz University for Health Sciences | Saudi Arabia | 0 |
83 | Université Mohammed V Agdal | Morocco | 0 |
84 | Alfaisal University | Saudi Arabia | 0 |
85 | Arabian Gulf University | Bahrain | 0 |
86= | Petroleum Institute Abu Dhabi | UAE | 0 |
86= | National Engineering School of Sfax | Tunisia | 0 |
88 | Mutah University | Jordan | 0 |
89 | Kafrelsheikh University | Egypt | 0 |
90 | Université de Carthage (7 de Novembre) | Tunisia | 0 |
91 | University of Balamand | Lebanon | 0 |
92 | Beirut Arab University | Lebanon | 0 |
93 | Université Hassan II Mohammadia | Morocco | 0 |
94 | Universite de Sousse | Tunisia | 0 |
95 | Université Abdelmalek Essaadi | Morocco | 0 |
96 | Petra University | Jordan | 0 |
97 | Djillali Liabes University | Algeria | 0 |
98 | Université Ferhat Abbas Setif | Algeria | 0 |
99 | Princess Sumaya University for Technology | Jordan | 0 |
100 | Université de la Manouba | Tunisia | 0 |
101 | Université Ibn Tofail Kénitra | Morocco | 0 |
102 | Université Saint Joseph de Beyrouth | Lebanon | 0 |
103 | Université de Gabes | Tunisia | 0 |
104 | Université Mohammed Premier Oujda | Morocco | 0 |
105 | Mohamed Boudiaf University of Science and Technology | Algeria | 0 |
106 | Sultan Qaboos University | Oman | 0 |
Thursday, May 14, 2015
How to improve your total Contribution in the academic caldener.
I have received several invitations over the last few months to let a team of consultants write up my research and get me into an ISI or Scopus journal. The most recent was from something called Prime Journal Consultants. It is hard to believe that anyone could be so naive as to pay money to someone who writes so badly but who knows? Maybe Chris Olsen has got a doctorate now.
Or maybe standards at Scopus and Thomson Reuters journals are not what they used to be.
Anyway, here is the first part of the message.
"The Most valuable part of your research is the data and study that you have already conducted, its time now to use the study and with our expert assistance create a complete research paper out of it and get it published to the highest impact factor ISI or Scopus Indexed journals to earn Recognition and Promotion.
The Contribution of Research Article Publishing Towards your Promotion
Publication is both a measure of a scholar’s knowledge and also a benchmark for academic success. The minimum percentage for promotion in terms of Research Publication is at least 35-40% of your total Contribution in the academic caldener.
Common Misconception About ISI publishing -Book A Dedicated Consultant Today
ISI Publishing is a time consuming process, The Genuine ISI journals would take time after getting you through rigorous revisions and edits. That is where our Dedicated Consultants Come in to assist you take you through theentire steps to get you an ISI acceptance."
Wednesday, May 13, 2015
The March of Pseudoscience Stumbles a Bit
Pseudoscience continues to thrive in the West. Although -- I think -- no longer offered by universities, homeopathy is still viewed with favour by many in the British establishment, including the Prince of Wales, and has received official recognition in Canada.
Meanwhile in Malaysia, Universiti Malaysia Pahang (UMP) has produced an anti-hysteria kit consisting of things like chopsticks, lime, salt, vinegar and pepper spray, which will repel evil spirits. The kit sells for Ringgit 8,750, which includes training and technical support.
The Malaysian religious authorities have been more sceptical than the British royal family and treated the kits with derision. UMP has replied by claiming the kit was based on scientific research, although it has not said where the research was published.
Monday, May 11, 2015
The Geography of Excellence: the Importance of Weighting
So finally, the 2015 QS subject rankings were published. It seems that the first attempt was postponed when the original methodology produced implausible fluctuations, probably resulting from the volatility that is inevitable when there are a small number of data points -- citations and survey responses -- outside the top 50 for certain subjects.
QS have done some tweaking, some of it aimed at smoothing out the fluctuations in the responses to their academic and employer surveys.
These rankings look a bit different from the World University Rankings. Cambridge has the most top ten placings (31), followed by Oxford and Stanford (29 each), Harvard (28), Berkeley (26) and MIT (16).
But in the world rankings MIT is in first place, Cambridge second, Imperial College London third, Harvard fourth and Oxford and University College London joint fifth.
The subject rankings use two indicators from the world rankings, the academic survey and the employer survey, but not internationalisation, student faculty ratio or citations per faculty. They add two indicators, citations per paper and h-index.
The result is that the London colleges do less well in the subject rankings since they do not benefit from their large numbers of international students and faculty. Caltech, Princeton and Yale also do relatively badly, probably because the new rankings do not take account of their very favourable faculty-student ratios.
The lesson of this is that if weighting is not everything, it is definitely very important.
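A toy illustration of the point, with invented universities, invented scores and invented weights that only loosely echo the difference between the world and subject methodologies: the same two institutions swap places depending on which indicators are counted and how heavily.

```python
# Invented indicator scores (out of 100) for two hypothetical universities.
scores = {
    "Global Metropolis University": {
        "academic": 95, "employer": 90, "international": 98,
        "faculty_student": 92, "citations_per_paper": 70,
    },
    "Selective Research College": {
        "academic": 88, "employer": 85, "international": 60,
        "faculty_student": 65, "citations_per_paper": 96,
    },
}

def overall(weights):
    """Weighted total using only the indicators named in 'weights'."""
    return {u: round(sum(ind[k] * w for k, w in weights.items()), 1)
            for u, ind in scores.items()}

# Invented weighting schemes: a broad "world-style" mix versus a narrower
# "subject-style" mix that drops internationalisation and staffing ratios.
world_style = {"academic": 0.4, "employer": 0.1, "international": 0.2,
               "faculty_student": 0.2, "citations_per_paper": 0.1}
subject_style = {"academic": 0.5, "employer": 0.2, "citations_per_paper": 0.3}

print("world-style weights:  ", overall(world_style))
print("subject-style weights:", overall(subject_style))
```

With the broad mix the big international university wins comfortably; drop the internationalisation and staffing indicators and the smaller, citation-strong college moves ahead.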
Below is a list of universities ordered by the number of top five placings. There are signs of the Asian advance -- Peking, Hong Kong and the National University of Singapore -- but it is an East Asian advance.
Europe is there too but it is Cold Europe -- Switzerland, Netherlands and Sweden -- not the Mediterranean.
Rank | University | Country | Number of Top Five Places |
---|---|---|---|
1 | Harvard | USA | 26 |
2 | Cambridge | UK | 20 |
3 | Oxford | UK | 18 |
4 | Stanford | USA | 17 |
5= | MIT | USA | 16 |
5= | UC Berkeley | USA | 16 |
7 | London School of Economics | UK | 7 |
8= | University College London | UK | 3 |
8= | ETH Zurich | Switzerland | 3 |
10= | New York University | USA | 2 |
10= | Yale | USA | 2 |
10= | Delft University of Technology | Netherlands | 2 |
10= | National University of Singapore | Singapore | 2 |
10= | UC Los Angeles | USA | 2 |
10= | UC Davis | USA | 2 |
10= | Cornell | USA | 2 |
10= | Wisconsin - Madison | USA | 2 |
10= | Michigan | USA | 2 |
10= | Imperial College London | UK | 2 |
20= | Wageningen | Netherlands | 1 |
20= | University of Southern California | USA | 1 |
20= | Pratt Institute, New York | USA | 1 |
20= | Rhode Island School of Design | USA | 1 |
20= | Parsons: the New School for Design | USA | 1 |
20= | Royal College of Art London | UK | 1 |
20= | Melbourne | Australia | 1 |
20= | Texas-Austin | USA | 1 |
20= | Sciences Po | France | 1 |
20= | Princeton | USA | 1 |
20= | Yale | USA | 1 |
20= | Chicago | USA | 1 |
20= | Manchester | UK | 1 |
20= | University of Pennsylvania | USA | 1 |
20= | Durham | UK | 1 |
20= | INSEAD | France | 1 |
20= | London Business School | UK | 1 |
20= | Northwestern | USA | 1 |
20= | Utrecht | Netherlands | 1 |
20= | Guelph | Canada | 1 |
20= | Royal Veterinary College London | UK | 1 |
20= | UC San Francisco | USA | 1 |
20= | Johns Hopkins | USA | 1 |
20= | KU Leuven | Belgium | 1 |
20= | Gothenburg | Sweden | 1 |
20= | Hong Kong | Hong Kong | 1 |
20= | Karolinska Institute | Sweden | 1 |
20= | Sussex | UK | 1 |
20= | Carnegie Mellon University | USA | 1 |
20= | Rutgers | USA | 1 |
20= | Pittsburgh | USA | 1 |
20= | Peking | China | 1 |
20= | Purdue | USA | 1 |
20= | Georgia Institute of Technology | USA | 1 |
20= | Edinburgh | UK | 1 |