Tuesday, February 24, 2015

Times Higher Education (THE) has just published its ranking of universities in the MENA (Middle East and North Africa) region. It is, according to THE's rankings editor Phil Baty, "just a snapshot" of what MENA rankings might look like after consultation with interested parties.
The ranking contains precisely one indicator, field-normalised citations, meaning that it is not the number of citations that matters but the number compared to the world average in specific fields. This was the flagship indicator in the THE world rankings and it is surprising that THE should continue using it in a region where it is inappropriate and produces extremely implausible results.
Number one in MENA is Texas A & M University at Qatar. This is basically an engineering school, evidently of a very high quality, and it is not clear whether it is a genuinely independent institution. It offers undergraduate courses in engineering and has master's programmes in chemical engineering. Its output of research is meagre, as THE obligingly indicates in its press release.
How then did it get to the top of a research impact ranking? Easily. One of its faculty, with a joint appointment with the mother campus in Texas, is one of the collaborators on a multi-contributor paper emanating from CERN. I will leave it to somebody else to count the number of contributors.
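A rough sketch of the mechanism, with invented numbers, may help. Field normalisation divides each paper's citations by the world average for its field and year and then averages over the institution's papers, so one CERN-style mega-paper can swamp a small portfolio:

    # Invented numbers, not THE's actual data or code: a minimal sketch of
    # how field-normalised citation impact rewards a tiny portfolio that
    # contains one hyper-cited multi-contributor paper.
    def field_normalised_impact(papers):
        """papers: list of (citations, world_average_for_field_and_year)."""
        return sum(c / avg for c, avg in papers) / len(papers)

    # Four ordinary papers plus one mega-paper with 1500 citations.
    small_school = [(2, 8), (0, 8), (1, 8), (3, 8), (1500, 30)]
    print(field_normalised_impact(small_school))  # ~10.2, ten times the world average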
Another CERN collaborator, Cadi Ayyad University in Morocco, is in sixth place. King Abdulaziz University is third.
There are ten Egyptian universities in the top thirty, including Alexandria but not Cairo.
Saturday, February 21, 2015
Slipping down the curve
The ETS Center for Research on Human Capital and Education has produced an analysis of the performance of American millennials (young adults born after 1980 and aged 16-34 at the time of assessment) on the Programme for the International Assessment of Adult Competencies (PIAAC) conducted by the OECD. The analysis may be over-optimistic in places but in general it is a devastating forecast of a coming crisis for American higher education and very probably for American society.
American millennials are by historical and international standards well educated, at least in terms of years of schooling, but they are also, on average, less literate, less numerate and less able to solve problems than their international counterparts. To be blunt, they appear to be relatively less intelligent.
Let's start with the literacy scores for adults aged 16 to 65 tested by PIAAC, that is, basically the adults making up the current work force.
The average score for OECD countries is 273. There is nothing unusual at the top -- Japan 296, Finland 288, the Netherlands 284.
The score for the USA is 270, just below average and better than six OECD countries. Overall the USA is at the moment mediocre compared with other developed nations.
Turning to numeracy, the OECD average is 269. Once again the top is dominated by East Asia and the shores of the Baltic and North Seas: Japan (288), Finland (282), Flanders (280), the Netherlands (280). The USA at 253 is well below average. Only Italy and Spain have lower scores.
For problem solving in technology-rich environments, the USA, with a score of 277, is again below the OECD average of 283.
This is the current work force, below average for literacy and problem solving, well below average for numeracy. It includes many who will soon die or retire and will be replaced by the millennial and post-millennial generations.
Take a look at the millennials. The gap is widening. For literacy the 6-point gap between the OECD average and the USA for 16-65 year olds becomes 8 points for the millennials.
For numeracy, the 13-point gap for 16-65 year olds has become 21 points for the millennials, and for problem solving 6 points becomes 9.
The situation becomes bleaker when we look at those who fail to meet minimum proficiency standards. Fifty percent of US millennials score below literacy level 3, 64% below numeracy level 3, figures exceeded only by Spain, and 56% below level 2 proficiency in problem solving, the worst among developed countries reporting data.
Nor is there any hope that there may be a recovery from the younger section of the cohort, those aged between 16 and 24. The literacy gap remains the same at eight points but the numeracy and problem solving gaps each increase by an additional point.
The report also emphasises the large and increasing gap between the high and low skilled. Here there is a big danger. A gap can be closed from two ends, and in the US it is easy to drag down high achievers by curtailing Advanced Placement programs, inflating grades, removing cognitive content from college courses, relying on group projects, adopting holistic admissions and assessment, and so on. The problem is that closing domestic gaps in this way just widens the international gap.
Wednesday, February 18, 2015
Free Speech Ranking
As the First Armoured Division made its way across Libya towards Tunisia at the end of 1942 and in early 1943, the troops were kept busy with early morning PT, lectures on "my county" and "the Everest expedition", and debates on things like "the Channel Tunnel would be a benefit". In his diary, my father, then a humble signalman, recounted another debate on whether "permanent conscription is a national asset": "Horace as usual made a vociferous speech and he said, 'There, bang goes my two stripes'."
The spectacle of soldiers in the middle of a war arguing against government policy with no more penalty than forfeiting two stripes -- if, in fact, Horace ever did lose them -- sounds slightly surreal today. Especially so, now that western schools, universities and other organisations appear to be becoming more and more hostile to "dangerous" ideas, a category that seems to be expanding relentlessly.
The British online magazine Spiked has just published its first Free Speech University Rankings, which are worth reading in detail.
These are actually ratings, not rankings, and divide universities into three categories:
- Red: has actively banned and censored ideas on campus
- Amber: has chilled free speech through intervention
- Green: has a hands-off approach to free speech.

Just a few examples:
Birkbeck University Students Union has apparently banned UKIP, because "homophobia, Islamophobia, disablism, xenophobia, misogyny, racism, fascism, and general discrimination [sic!] is rife amongst its members, supporters, officials, and prospective candidates". If that wasn't bad enough, "John Sullivan, UKIP candidate for Forest of Dean and West Gloucestershire, said that regular physical exercise for boys released tension and thus avoided homosexuality."
The University of East London Students Union has banned materials opposing unrestricted abortion because "any material displayed in the Union building should adhere to the principle of ‘safe space’ and which resolves to ‘ensure an accessible environment in which every student feels comfortable, safe and able to get involved in all aspects of the organisation free from intimidation or judgement".
The University of Warwick, noting the protected characteristics of "age, disability, gender reassignment, race, religion or belief, sex, sexual orientation, pregnancy and maternity and marriage or civil partnership", prohibits "displaying material that is likely to cause offence to others" or "spreading malicious rumours or insulting someone."
Monday, February 09, 2015
Affiliation in the News Again
Haaretz has published a story about Ariel University, in the occupied West Bank, suggesting that it is offering to pay researchers for adding its name to papers and grant proposals.
The report may be biased and the offer, which seems to apply to only one field, is probably an attempt to get round local and international ostracism. It is a much less blatant attempt to buy affiliations and therefore citations than the wholesale distribution of part time contracts by King Abdulaziz University in Jeddah to researchers on the Thomson Reuters Highly Cited lists.
Another case of affiliation abuse was that of Mohamed El Naschie, formerly editor of the journal Chaos, Solitons & Fractals, and writer of many articles that were cited frequently by himself and a few friends. El Naschie was also fond of giving himself affiliations that had little or no substance: Cambridge where he was a Visiting Scholar, allowed to use the library and other facilities, the University of Surrey for no discernible reason, and Alexandria University with which he had a tenuous connection.
Most of El Naschie's affiliations did not mean very much. Cambridge was getting lots of citations anyway and did not need him. But Alexandria University produced a modest amount of research, and El Naschie's self-citations went a long way and took Alexandria into the top 200 of the THE 2010 world university rankings.
This sort of thing is likely to continue, especially since there is now a stream of papers and reviews in physics and sometimes in medicine and genetics that have hundreds of contributors and scores of contributing institutions. A part-time contract with a contributor to the Review of Particle Physics that includes adding the institution as a secondary affiliation could give an enormous boost to citation counts, especially if they are field- and year-normalised.
It would be a good idea for academic editors and publishers to review their policies about the listing of affiliations. Perhaps second (or more) affiliations should only be allowed if documentary evidence of a significant connection is provided.
Likewise rankers ought to think about not counting secondary affiliations, as Shanghai Center for World Class Universities did last year, or giving them a reduced weighting.
Saturday, February 07, 2015
Ranking Universities is a Really Serious Business
It seems that Webometrics has been hacked. Let's hope the problem is sorted out soon.
From the Webometrics site: "Ranking Web of Universities was attacked by external hackers. They did published [sic] hate messages and they had access to the Ranking, changing significantly the rank of at least one university and altering the structure and arrangement of the system. We are trying to fix the problems and sincerely apologize for any inconvenience. We hope to be able to be back in a few days with very exciting news and updated information. Thanks for your patience."
Wednesday, February 04, 2015
New York Times said it was a really big gap closing. It didn't seem so big then.
[Apologies to Bob Dylan, 'Talkin' New York']
The world of education is obsessed with gaps. Every time the PISA results come out there is renewed concern about the stubborn and growing gap between the United States and some Asian and Eastern European countries, although the US should perhaps be congratulated for every large ethnic group doing as well as or better than its international counterparts.
At the same time, there is recurrent anguish over the failure of African Americans and Hispanics to match the academic achievements of Whites and (East?) Asians.
According to the New York Times (NYT), the huge achievement gap between wealthy and poor American children is a major cause of the mediocre performance of the American economy. Just closing the American gap and going up a few points in the PISA rankings would apparently boost the economy significantly and create billions in tax revenue.
So how to do this? The NYT reports that a recent study by the Washington Center for Equitable Growth claims that things like more early childhood education, reducing lead paint exposure and letting students sleep a bit more will do the trick.
And has anyone managed to close the gap? Yes, according to the study, Montgomery County in Maryland, an affluent, racially mixed county near Washington DC,

"was able to reduce the gap and increase scores after instituting all-day kindergarten programs, reducing class size, investing in teacher development and reducing housing-based segregation in its schools and a host of other reforms, Montgomery County, Maryland was successful in both improving average achievement test scores and reducing achievement gaps. The percentage of 5th graders reading at or above the proficient level on the Maryland State Assessment rose for all racial and ethnic groups between 2003 and 2009. In addition, gaps between the disproportionately lower-income black and Hispanic students and the disproportionately higher-income white and Asian students narrowed.”
So we should all go to Montgomery County to find out how to close the gap, bring America up to OECD or even Finnish or Korean standards and achieve Chinese rates of economic growth?

Perhaps not.
The good news from Montgomery was a little surprising because I was sure that I had read a story that painted a rather less cheerful picture of the school system there.
Here it is. From the Washington Post of March 12, 2013. 'In Montgomery schools, the achievement gap widens in some areas', by Donna St George
Locals were baffled as to how the school system could spend so much money and still do so badly.
"The achievement gap that separates white and Asian students from black and Latino students has grown wider in Montgomery County in several measures of academic success, according to a report released Tuesday."
“The 130-page report points to progress in five of 11 performance indicators in recent years. The school system improved on gaps in school readiness and high school graduation, for example. But disparities widened in advanced-level scores for state math exams in third, fifth and eighth grades. There were mixed results in two categories.”
' “We still rank as one of the top spenders nationally in education, and then to lose ground is extremely concerning,” said Council Vice President Craig Rice (D-Upcounty), who called for more urgency. “It just boggles my mind that this can be so far below the radar.” '
But evidently the preferred solution is more money.
Montgomery Superintendent Joshua P. Starr said he agrees with most of the analysis. He wrote,
“much of the $10 million the school system is seeking above mandatory funding levels in its budget proposal would help address achievement disparities, including 30 “focus” teachers to reduce class sizes in English and math at middle and high schools where students are struggling."
Anyway, here are some extracts from the report from Montgomery County itself.
“This report finds that since 2008 MCPS has made progress, but significant achievement gaps remain, particularly among measures of at-risk academic performance. Over the same period, MCPS also lost ground in narrowing the achievement gap among several measures of above grade level performance that align with MCPS’ Seven Keys initiative and the Common Core State Standards.”
and
“MCPS narrowed the achievement gap across five measures: school readiness, MSA proficiency, suspensions, academic ineligibility, and graduation rates. These gaps narrowed by increasing the performance of most subgroups while accelerating the performance of the lowest performing subgroups."
and
“MCPS achieved mixed or no progress in narrowing the gap on two measures: dropout rates and completion of USM or CTE program requirements among graduates. For these two measures, MCPS tended to narrow the gap by race and ethnicity, but did not achieve the same progress among service groups.”

and
“MCPS’ achievement gap widened across four measures: MSA advanced scores, Algebra 1 completion by Grade 8 with C or higher, AP/IB performance, and SAT/ACT performance. Among these four measures of above grade level performance that align with MCPS’ Seven Keys, high performing subgroups made greater gains on these benchmarks than low performing subgroups, thus widening the gap. More specifically: • The MSA Advanced Gaps in Grade 3 narrowed across most subgroups for reading by 2-7% but widened for math by 5-33% from 2007 to 2012; the Grade 5 gaps narrowed across most subgroups for reading by 2-16% but widened for math by 3-37%, and the Grade 8 gaps widened for both reading and math by 9-56%. • The Algebra 1 by Grade 8 with C or Higher Gap widened by 7-19% by race, ethnicity, special education, and FARMS status from 2010 to 2012, but narrowed by 7% by ESOL status. • The AP/IB Performance Gap among graduates widened by 6-37% by race, ethnicity, and service group status from 2007 to 2012. • The SAT/ACT Performance Gap among graduates held constant by special education and ESOL status from 2010 to 2012, but increased by race, ethnicity, and income by 3-6%."
So. Montgomery County reduced the gap for grade 5 reading for most subgroups but it widened for math. By Grade 8 it widened for reading and math, as did the AP/IB performance gap and the SAT/ACT gap for most groups.
When we get to the Center for Equitable Growth and the New York Times, only the grade 5 reading improvement remains and the failures in other areas have disappeared. How very careless of them.
Monday, February 02, 2015
Quality and Bias in University Rankings
I have just finished reading a very interesting unpublished paper, 'Measuring University Quality' by Christopher Claassen of the University of Essex.
He finds that all the major international rankings tap to some extent an underlying unidimensional trait of university quality and that this is measured more accurately by the US News Best Global Universities, the Center for World University Rankings (Jeddah) and the Academic Ranking of World Universities (Shanghai).
He also finds that these rankings are not biased towards their home countries, in contrast to the Times Higher Education, QS and Webometrics rankings.
Friday, January 30, 2015
Who says university isn't worth it?
In Malaysia it might be.
According to a local blog the customary dowry ("hantaran," distinct from the religiously sanctioned "mas kahwin" which is very modest) paid to the family of the bride varies significantly according to the bride's level of education.
For a woman with UPSR (primary school certificate) it is 2,000-4,000 Ringgit.
For SPM (secondary school certificate) holders it is 4,000-8,000 Ringgit.
For STPM holders (equivalent to 'A' levels) it is 8,000-12,000 Ringgit.
For degree holders it is 12,000-15,000 Ringgit.
For master's holders it is 15,000-20,000 Ringgit.
For PhDs it is 20,000-30,000 Ringgit.
As far as I know, there is no premium for international universities or for those with a high place in the global rankings. Yet.
Thursday, January 29, 2015
THE Under Fire in the Far North and Down Under
Times Higher Education (THE) has been attacked for the latest spin-off from its world rankings. Alex Usher of Higher Education Strategy Associates has tweeted "your "most international unis" rankings lack even the barest face validity. Galway more intl than Harvard? C'mon"
The higher education editor of the Australian, Julie Hare, reports that Australian observers are surprised that Monash University, reputed to be the most international Australian university, has been ranked so low, and quotes a comment at THE: "What cretin can assert that LSE and Cambridge are less "international" than Brunel and Canterbury?"
I wonder if there will be similar comments on THE's preview, in advance of a summit in Qatar, of its forthcoming MENA rankings. This is the top five of a research impact indicator based on field and year normalised citations. The first place goes to Texas A and M University Qatar. The other four are the Lebanese American University, King Abdulaziz University, Jeddah, Qatar University and the American University of Beirut.
In case you are wondering, Texas A and M Qatar does have someone on the Large Hadron Collider project.
Wednesday, January 28, 2015
The Most International Universities
Times Higher Education has released its list of the top 100 most international universities. This is simply the International Outlook indicator extracted from last year's world rankings. It counts the proportions of international students and staff and the percentage of papers with international collaborators.
The top three are in Switzerland. The National University of Singapore is fourth and Ecole Polytechnique in Paris fifth. The Ecole is up from 28th place last year which probably means a "clarification of data" of some sort.
So what makes a university international?
It helps a lot to be located in an English-speaking country. The UK and Australia get high scores.
What is more noticeable is that many small countries do well, especially if they are located next to a big country with the same or similar language and culture: Switzerland, Singapore, Macau, Hong Kong, Austria, Denmark. Big countries like Mainland China, the USA and India do not.
Perhaps THE (and QS) should think about the implications of its methods. Does it make sense to count as international a student who moves a few miles from Fermanagh to Galway, Bavaria to Austria or Johor to Singapore?
Perhaps THE should count the whole of the EU as a single country or give extra points for students and faculty who cross an ocean rather than an increasingly meaningless line on a map. Perhaps also, Macau and Hong Kong should be reunited methodologically with the Mainland. What about counting out-of-state students at US universities?
Thursday, January 22, 2015
More University Mission Creep
Demands on western universities continue to increase and so do calls for more and more indicators in national and global rankings. Universities, it seems, are underemployed if they just provide instruction in academic, professional and technical subjects and promote research and scholarship.
Now they are supposed to support diversity and inclusiveness, build character, grit and resilience, promote anti-racism, combat sexism, homophobia, cisgender normativity and weightism, boycott Israel, reward students for overcoming adversity, engage with communities, combat terrorism, transform lives, provide gender free bathrooms, sponsor near-professional level sports teams, boycott fossil fuels, make everybody safe and comfortable except for those whose privilege needs continued confrontation.
All this is now spilling over into the rankings business. We have already seen Universitas 21, which ranks national university systems, give countries a score for the number of female students and faculty, and there have been repeated proposals that the US News law school rankings should include faculty and student diversity among their criteria.
US News has published a diversity index that consists simply of calculating the percentage of minority students. First place goes to Cornell, followed by the University of Hawaii at Manoa, Whittier College in California, the University of the District of Columbia and Nova Southeastern University in Florida. A quick calculation of the correlation between the diversity index and overall scores in the law school rankings shows no significant relationship between diversity so defined and overall quality as measured by the rankings.
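For anyone who wants to repeat the exercise, a minimal sketch of the calculation (the numbers below are placeholders, not the actual US News data):

    # Placeholder numbers, not the actual US News data: Pearson correlation
    # between a diversity index and an overall ranking score.
    from scipy.stats import pearsonr

    diversity_index = [0.61, 0.35, 0.42, 0.55, 0.28, 0.47]
    overall_score = [48, 72, 65, 51, 80, 58]

    r, p = pearsonr(diversity_index, overall_score)
    print(f"r = {r:.2f}, p = {p:.3f}")  # a large p means no significant relationship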
To incorporate such an index into the law school rankings would be a pointless exercise. If Nova Southeastern University Law School, which accepts nearly half of those who apply and a third of whose graduates are not employed nine months after graduation, were to get a high ranking, this would be seriously misleading for everybody.
The US federal government is now proposing to rate colleges according to their admission of low income and first generation students, affordability, and outcomes such as graduation rates, graduate employment and entry into postgraduate programs. The problem here is that in the US and almost everywhere else these objectives are mutually exclusive. Low income students are, on average, likely to be less academically capable, and that means, if academic standards remain unchanged, that fewer will graduate and fewer will go on to graduate study.
If the ratings plan ever happens the likeliest consequence of the colliding demands is that it will become much easier to get a degree or get into graduate school. There are dozens of ways in which academic standards can be eroded, most of which we have seen already somewhere.
Another kind of creep is the rising chorus that universities should encourage and promote civic engagement, a rather slippery concept that is difficult to describe but covers a variety of worthy activities reaching beyond the campus, such as promoting local economic development, employing women and minority groups, helping poor students succeed, buying local products and encouraging students to be volunteer teachers. A recent conference in South Africa ended with a call for action that included a proposal that rankings should take account of such activities.
Adam Habib, Vice-Chancellor of the University of the Witwatersrand, even proposed to boycott rankings unless they included civic engagement as an indicator.
"Gather a group of universities and tell the rankings that you'll collectively withdraw if they don't take in civic engagement in the future. I guarantee that every one of them will listen".
But why should universities be required to do what other institutions have failed to do even though they are far better qualified? If entrepreneurs cannot promote economic growth, revolutionary parties cannot achieve social justice and trade unions cannot help the poor, then just why should universities be expected to do so?
Part of the drive for new indicators is probably rooted in the realisation that universities are losing much of their reason for existence. Research is increasingly done by specialised institutes, companies and hospitals, for-profit organisations offer no-frills instruction at low prices, and online learning is replacing traditional seminars and lectures. Civic engagement looks like the new quality audit, a way of keeping busy those who are reluctant to teach or research.
If all the demands for new indicators are met we will end up with hugely bloated rankings that fail to make any meaningful distinctions.
Saturday, January 17, 2015
New Resource from IREG
A new resource for anyone interested in university rankings is available at the International Rankings Experts Group (IREG) site.
The Inventory of National Rankings has been prepared by the Perspektywy Education Foundation, Poland, and provides basic data about a variety of national ranking systems. Two of them have been approved by IREG.
Wednesday, January 07, 2015
US Federal Ratings Plan: A Few Answers, More Questions
The US Department of Education has just revealed the progress that it has made towards its planned ratings for colleges and universities. There has been over a year of public discussion since the Obama administration announced that it was planning on introducing a new system. Unfortunately, it seems that there is still a long way to go before a final product emerges and the administration's forecast of a launch in August or September 2015 may be too optimistic.
Since the 1980s, the US News & World Report's 'America's Best Colleges' has been followed avidly by students and other stakeholders. These rankings have been criticised, sometimes with justification, but they do provide a reasonably accurate guide to some of the things that students, parents, employers and counsellors want to know: how likely a student is to graduate once admitted, the typical academic ability of fellow students, reputation among peers, and the resources available for teaching.
There is, of course, much that the US News rankings do not tell us. The international rankings produced by Shanghai Jiao Tong University's Center for World-Class Universities, Quacquarelli Symonds (QS), Times Higher Education and now US News with its Best Global Universities are probably even more limited since they focus largely or entirely on research and postgraduate training. There is also a widespread feeling that existing rankings are unfair to schools that try to educate students from non-traditional backgrounds or underrepresented groups.
The demand for more information and for greater accountability comes when American universities are entering a time of increasing pressure and constraint. Costs are rising inexorably, even though many students are taught not by hugely expensive superstar professors but by poorly paid adjuncts and untrained graduate assistants. Many students graduate late or not at all and incur a large and growing debt burden from which bankruptcy rarely provides an escape. Meanwhile the more reliable global university rankings show American universities steadily losing ground to Asian institutions.
Many colleges and universities are facing a death spiral as stagnant or declining admissions lead to a fall in the number of graduates, which in turn erodes reputation and undermines alumni contributions. Underlying everything is the grim reality that the overall quality of graduates of American high schools is apparently insufficient to supply colleges and universities with students capable of completing a degree within a reasonable length of time.
The federal government has become increasingly concerned over these trends and the failure of American higher education to provide a route to secure employment and middle class status. The new plan had its origins in a speech by President Obama at the University at Buffalo, SUNY, in August 2013.
A succession of hearings and forums has been held and finally the Department of Education has come out with a draft framework. The department has indicated that it will publish ratings, not rankings, so that colleges and universities will be divided into three categories: high performers, low performers and those in between. Two-year institutions such as community colleges and four-year colleges and universities will be assessed separately, and institutions that teach only postgraduates or do not grant degrees will not be included. The main source of data will be information collected by the federal government.
According to the document, 'For Public Feedback: A College Ratings Framework', the objectives are to help colleges measure and make progress towards the goals of access, affordability and outcomes, to provide information for students and families, and to help the government ensure that financial aid is well used.
The department has announced the indicators it is thinking about using. These include enrolment of low income and first generation students, family income levels and the average net price of an institution. Student outcomes could be measured by completion rates, transfer rates and the number of students going on to graduate school. More details can be found in 'A New System of College Ratings – Invitation to Comment'.
Several questions remain unanswered. A rating consisting of three categories may be too crude. There will almost certainly be a large gap between those at the top of the high performing group and those on the edge of the intermediate category. Membership of the same large group will not help anyone trying to compare two universities. A small change in one or two indicators might push colleges out of the intermediate group into the underperformers, where they could suffer financial sanctions.
If the department provides the scores or raw data for each indicator then it would be relatively simple for analysts or journalists to calculate numerical rankings.
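To illustrate, here is a minimal sketch (the indicator names, weights and scores are hypothetical, not anything the department has proposed) of how published per-indicator scores could be turned back into a ranking:

    # Hypothetical indicator names, weights and scores: turning published
    # per-indicator data into a numerical ranking in a few lines.
    colleges = {
        "College A": {"completion": 0.62, "net_price": 0.55, "access": 0.40},
        "College B": {"completion": 0.48, "net_price": 0.70, "access": 0.65},
        "College C": {"completion": 0.75, "net_price": 0.35, "access": 0.30},
    }
    weights = {"completion": 0.5, "net_price": 0.3, "access": 0.2}

    def composite(scores):
        return sum(weights[k] * v for k, v in scores.items())

    ranking = sorted(colleges, key=lambda c: composite(colleges[c]), reverse=True)
    print(ranking)  # ['College B', 'College A', 'College C'] with these numbers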
The most disquieting thing about this document is that the department seems to have given little thought to how easy it would be to game much of the data. There are, for example, dozens of ways in which colleges and universities could increase the number of students who graduate on time, even if it means undermining the quality of their degrees and their value to potential employers. Rating institutions according to the repayment of student loans might encourage universities to close humanities and social science departments while coaxing students into programs for which they might not be suited.
It is likely that there will be more arguments and discussion before the ratings are launched and it remains to be seen how much credibility they will have when they do appear.
Friday, December 19, 2014
Study International
Study International is a new website published by Hybrid News that will provide news, analysis and advice about international higher education. Some posts from this blog have been republished there; see, for example.
Wednesday, December 10, 2014
Is Asia really rising?
Over the last few years there has been a lot of talk about the continuing rise of Asian universities. In The Conversation, Gerard Postiglione of the University of Hong Kong has pointed out that Asian universities now take one in eight of the top 200 places in the Times Higher Education World University Rankings and he predicts that by 2040 a quarter of the top universities will be Asian.
It is interesting that he apparently regards the THE rankings as the arbiter of excellence despite an eccentric methodology that, among other oddities, claims that an excellent but small research institute is the best university in Italy and among the best in the world.
Exactly what progress in the THE rankings means is difficult to decide. A rise in the score for the research indicator, for example, could result from an increase in the number of publications, a fall in the number of academic and/or research staff, an increase in research income, an improvement in the research reputation survey or a combination of some or all of these.
Predicting what will happen in the THE rankings has become even more difficult since THE broke up with their data supplier, Thomson Reuters, raising the possibility that there will be another round of methodological changes.
There are some sceptics such as Alex Usher of Higher Education Strategy Associates but in general the rise of Asia and the decline of the US and UK seems to have become part of the accepted wisdom of Western pontificators.
So is Asia rising? And if it is, is it the whole of the continent or just parts of it?
The problem is that the rankings vary in their ability to identify medium-term trends. QS and THE give a large weighting to reputation surveys that are inherently volatile. They also use an unstable number of institutions to generate the means from which processed scores are calculated, and this can lead to fluctuations in the final overall scores.
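A minimal sketch of that second point, with made-up citation scores: a university's standardised score can move simply because the pool of institutions used to compute the mean changes, even when its own data do not.

    # Made-up numbers: z-scores depend on the pool used to compute the mean,
    # so adding or dropping institutions moves everyone's processed scores.
    import statistics

    def z_score(pool, value):
        return (value - statistics.mean(pool)) / statistics.stdev(pool)

    pool_year1 = [40, 50, 55, 60, 70]
    pool_year2 = [40, 50, 55, 60, 70, 20, 25]  # two weaker entrants join the pool

    print(z_score(pool_year1, 60))  # ~0.45
    print(z_score(pool_year2, 60))  # ~0.78, same raw score, higher processed score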
The Academic Ranking of World Universities (ARWU) produced by the Shanghai Center for World-Class Universities is probably the most useful for identifying changes over the last decade since there were no significant changes between 2004 (when schools with strengths in the social sciences were helped by exemption from the Nature and Science indicator) and 2014 (when Thomson Reuters issued a new list of highly cited researchers).
The number of universities in the Shanghai top 500 provides strong evidence that some parts of Asia are making rapid progress. The number of mainland Chinese universities (excluding Hong Kong and Taiwan) has risen from 8 to 33. The number of Korean universities has gone from eight to ten, Taiwanese from three to six, Saudi from zero to four, and Malaysian from zero to two.
But some parts of Asia appear to be stagnant or in relative decline. The number of Japanese universities has fallen from 36 to 19 and Indian from three to one while the number of Hong Kong institutions has remained the same at five.
Looking at the performance of some national flagships in the ARWU publications indicator provides more evidence of an expansion of research in some Asian countries. Compared to Harvard's benchmark score of 100, Peking University has risen from 49.8 in 2004 to 63.6 in 2014. Other Asian universities have also had substantial growth over the decade. Seoul National University went from 62.6 to 67.8, National Taiwan University from 52.6 to 57.9 and Istanbul University from 30.7 to 34.9.
Note that the raw numbers of publications have been compressed, apparently by a square root, so that in 2004 Peking was in fact publishing about a quarter of the number of papers produced by Harvard rather than a half.
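To spell out the arithmetic under that square-root assumption (mine, since ARWU does not document the transform precisely): a publications score of 49.8 implies a raw ratio of (49.8/100)^2 ≈ 0.248, that is, roughly a quarter of Harvard's output, which is consistent with the figures above.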
On the other hand, Tokyo University, which in 2004 had the second highest publications score in the world, fell from 91.9 to 73 and the University of Hong Kong from 46.4 to 44.
If we look at Shanghai's Productivity per Capita indicator, which measures quality by dividing five combined indicators by the number of faculty, we find some Asian universities doing well. Peking goes from 5.9 to 16.5, Seoul National University from 19 to 23.4 and National Taiwan University from 17.5 to 19.9. Tokyo, meanwhile, has fallen from 49.8 to 29.2. Hong Kong University, on the other hand, has risen from 13.1 to 22.4.
Confirmation of the trends for research output comes from the Output indicator in the Scimago rankings, which is based on data provided by Scopus. Peking, Seoul National University, National Taiwan University and Universiti Malaya rose between 2009 and 2014. However, the scores for Tokyo and Hong Kong both fell.
On the other hand, the evidence of Scimago's normalised impact indicator, which might measure research quality, shows Peking rising but Seoul National University and Hong Kong falling.
It would seem that China and the overseas Chinese communities and Korea are expanding the quantity of research but progress at higher levels is slower. There are also islands of research productivity in West and Southeast Asia. In Central Asia, the Indian Subcontinent and Indonesia there is very little significant research activity while Japan is actually declining.
It is interesting that he apparently regards the THE rankings as the arbiter of excellence despite an eccentric methodology that, among other oddities, claims that an excellent but small research institute is the best university in Italy and among the best in the world.
Exactly what progress in the THE rankings means is difficult to decide. A rise in the score for the research indicator, for example, could result from an increase in the number of publications, a fall in the number of academic and/or research staff, an increase in research income, an improvement in the research reputation survey or a combination of some or all of these.
Predicting what will happen in the THE rankings has become even more difficult since THE broke up with their data supplier, Thomson Reuters, raising the possibility that there will be another round of methodological changes.
There are some sceptics such as Alex Usher of Higher Education Strategy Associates but in general the rise of Asia and the decline of the US and UK seems to have become part of the accepted wisdom of Western pontificators.
So is Asia rising? And if it is, is it the whole of the continent or just parts of it?
The problem is that the rankings vary in their ability to identify medium term trends. QS and THE give a large weighting to reputation surveys that are inherently volatile,They also use an unstable number of institutions to generate means from which processed scores are calculated and this can led to fluctuations in the final overall scores
The Academic Ranking of World Universities (ARWU) produced by the Shanghai Center for World-Class Universities is probably the most useful for identifying changes over the last decade since there were no significant changes between 2004 (when schools with strengths in the social sciences were helped by exemption from the Nature and Science indicator) and 2014 (when Thomson Reuters issued a new list of highly cited researchers).
The number of universities in the Shanghai top 500 provides strong evidence that some parts of Asia are making rapid progress. The number of mainland Chinese universities (excluding Hong Kong and Taiwan) has risen from 8 to 33, The number of Korean universities has gone from eight to ten, Taiwanese from three to six, Saudi from zero to four, and Malaysian from zero to two.
But some parts of Asia appear to be stagnant or in relative decline. The number of Japanese universities has fallen from 36 to 19 and Indian from three to one while the number of Hong Kong institutions has remained the same at five.
Looking at the performance of some national flagships in the ARWU publications indicator provides more evidence of an expansion of research in some Asian countries. Compared to Harvard's benchmark score of 100, Peking University has risen from 49.8 in 2004 to 63.6 in 2014. Other Asian universities have also had substantial growth over the decade. Seoul National University went from 62.6 to 67.8, National Taiwan University from 52.6 to 57.9 and Istanbul University from 30.7 to 34.9.
Note that the raw numbers of publications have been modified by a logarithm so that in 2004 Peking was in fact publishing about a quarter of the number of papers produced by Harvard rather than a half.
On the other hand, Tokyo University, which in 2004 had the second highest publications score in the world, fell from 91.9 to 73 and the University of Hong Kong from 46.4 to 44.
If we look at Shanghai's Productivity per Capita indicator, which measures quality by dividing five combined indicators by the number of faculty, we find some Asian universities doing well. Peking goes from 5.9 to 16.5, Seoul National University from 19 to 23.4 and National Taiwan University from 17.5 to 19.9. Tokyo, meanwhile, has fallen from 49.8 to 29.2. Hong Kong University, on the other hand,has risen from 13.1 to 22.4.
Confirmation of the trends for research output comes from the Output indicator in the Scimago rankings, which is based on data provided by Scopus. Peking, Seoul National University, National Taiwan University and Universiti Malaya rose between 2009 and 2014. However, the scores for Tokyo and Hong Kong both fell.
On the other hand, the evidence of Scimago's normalised impact indicator, which might serve as a measure of research quality, shows Peking rising but Seoul National University and Hong Kong falling.
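For readers unfamiliar with the indicator, normalised impact compares each paper's citations with the world average for its field (and, in the real Scopus-based calculation, its year and document type); a score above 1 means above world average. A toy version, with invented field baselines:

```python
# Toy normalised impact: each paper's citations divided by the invented
# world-average citations for its field, then averaged across papers.

world_avg = {"physics": 8.0, "medicine": 12.0, "economics": 4.0}

papers = [
    ("physics", 16),   # twice the field average
    ("medicine", 6),   # half the field average
    ("economics", 4),  # exactly the field average
]

normalised_impact = sum(c / world_avg[f] for f, c in papers) / len(papers)
print(round(normalised_impact, 2))  # 1.17 -> about 17% above world average
```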
It would seem that China and the overseas Chinese communities and Korea are expanding the quantity of research but progress at higher levels is slower. There are also islands of research productivity in West and Southeast Asia. In Central Asia, the Indian Subcontinent and Indonesia there is very little significant research activity while Japan is actually declining.
One Way of Rising in the Rankings
One way of rising in the rankings is to amalgamate. Watch out for Paris-Saclay in next year's Shanghai rankings.
From BBC Online News:
"Dominique Vernay, the president of this new university, says that within a decade he wants Paris-Saclay to be among the top ranking world universities.
"My goal is to be a top 10 institution," he says. In Europe, he wants Paris-Saclay to be in the "top two or three".
In university rankings, big is beautiful, and the Paris-Saclay will have 70,000 students and 10,000 researchers. There will be an emphasis on graduate courses and recruiting more international students and staff.
The idea of bringing together individual colleges into a "federal university" has been borrowed from the UK.
"Our model isn't that far from the Oxbridge model," he says.
To put it into scale, Mr Vernay says Paris-Saclay is going to be twice the size of the University of California, Berkeley, one of the flagships of the US university system."
Saturday, December 06, 2014
Ranking Status Wars 4
Meanwhile in Saudi Arabia, scholarships granted under the Custodian of the Two Holy Mosques Programme will go to students at 200 scientific universities chosen from the Big Four rankings, US News, Times Higher Education, QS and Shanghai.
Wednesday, December 03, 2014
Ranking Status Wars 3
The US News and World Report's Best Global Universities has been admitted to the ranks of the elite rankings. The Hong Kong government has announced a scholarship programme that will pay tuition and bursaries at universities in the top 100 of the QS world rankings, the THE world rankings, the Shanghai rankings and the USNWR global rankings.
Thursday, November 20, 2014
More on the Ranking Status Wars
The Economist thinks there are two international university rankings worth talking about. Will the prestigious THE rankings continue to be prestigious now they are no longer powered by Thomson Reuters but have to share their data partner with QS?
"But most universities still have far to go. Only two Chinese institutions number in the top 100 in the Times Higher Education World University Rankings. Shanghai’s Jiao Tong University includes only 32 institutions from mainland China among the world’s 500 best. The government frets about the failure of a Chinese scholar ever to win a Nobel prize in science (although the country has a laureate for literature and an—unwelcome—winner in 2010 of the Nobel peace prize, Liu Xiaobo, an imprisoned dissident)".
The Times [Higher Education rankings] they are a-changing
Maybe I'll get my five minutes of fame for being first with a Dylan quotation. I was a bit slow because, unlike Jonah Lehrer, I wanted to check that the quotation actually exists.
Times Higher Education (THE) have announced that they will be introducing reforms to their World University Rankings and ending their partnership with media and data giant, Thomson Reuters (TR).
Exactly why is not stated. It could be rooted in a financial disagreement. Maybe THE feels betrayed because TR let US News use the reputation survey for their new Best Global Universities rankings. Perhaps THE got fed up with explaining why places like Bogazici University, Federico Santa Maria Technical University and Royal Holloway were world beaters for research impact, outshining Yale, Oxford and Cambridge.
The reputation survey will now be administered by THE itself in cooperation with Elsevier and will make use of the Scopus database. Institutional data will be collected from universities, the Scopus database and the Scival analysis tool by a new THE team.
The coming months will reveal what THE have in store but for now here is a list of recommendations. No doubt there will be many more from all sorts of people.

- Display each indicator separately instead of lumping them together into Teaching, Research and International Outlook. It is impossible to work out exactly what is causing a rise or fall in the rankings unless they are separated.
- Try to find some way of reducing the volatility of the reputation survey. US News do this by using a five-year average and QS by rolling over unchanged responses for a further two years (see the sketch after this list).
- Consider including questions about undergraduate teaching or running another survey to assess student satisfaction.
- Reduce the weighting of the citations indicator and use more than one measure of citations to assess research quality (citations per paper), faculty quality (citations per faculty) and research impact (total citations). Use field normalisation, but sparingly and sensibly, and forget about that regional modification.
- Drop the Industry Income: Innovation indicator. It is unfair to liberal arts colleges and private universities and too dependent on input from institutions. Think about using patents instead.
- Income is an input. Do not use it unless it is to assess the efficiency of universities in producing research or graduates.
- Consider dropping the international students indicator or at least reducing its weighting. It is too dependent on geography and encourages all sorts of immigration scams.
- Benchmark scores against the means of a constant number of institutions. If you do not, the mean indicator scores will fluctuate from year to year, causing all sorts of distortions.
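A minimal sketch of the two damping techniques mentioned in the second recommendation, on invented survey numbers; the actual US News and QS procedures are naturally more elaborate:

```python
# Two ways of damping a volatile survey, on invented yearly vote counts.

votes = {2010: 120, 2011: 95, 2012: 160, 2013: 90, 2014: 140}

# US News style: report a five-year average rather than one year's figure.
print(sum(votes.values()) / len(votes))  # 121.0

# QS style: keep a response in the pool for three years, so each edition
# blends the current round with the two previous ones.
def rolled_over(year, pool=votes):
    window = [pool[y] for y in (year - 2, year - 1, year) if y in pool]
    return sum(window) / len(window)

print(rolled_over(2014))  # 130.0 -- the 2012-2014 rounds blended
```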
Thursday, November 06, 2014
The US News Arab Region Rankings
Hardly a week passes without the publication of yet more international university rankings. This week it was the Best Arab Region Universities from the US News, famous for producing America's Best Colleges for over three decades.
These rankings are research based. There are nine indicators, one of which measures the number of publications and has a weighting of 30 per cent. The other eight relate to citations in some way. There are no indicators measuring faculty student ratio, teaching quality, graduate employment, income or reputation.
Inclusion in the rankings required 400 papers in the Scopus database over a five-year period, 2009 to 2013. It is a serious indictment of Arab universities that only 91 institutions could reach this modest target.
There is an interesting section in the methodology:
"Papers published by Arab region institutions in the subject area of physics and astronomy were excluded based on input from Elsevier's bibliometric experts, who determined that their citation characteristics would distort the results of the overall rankings. There is, however, a separate subject ranking for physics and astronomy that is based on papers published exclusively in those fields."
This presumably means that US News is aware of the distorting effect of physics publications with a large number of contributing authors, which have helped propel institutions such as Panjab University and Federico Santa Maria Technical University into high spots in the THE world rankings.
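To see the scale of the distortion, here is a toy example with invented numbers showing what happens when one hugely cited multi-contributor paper is credited in full to a small institution:

```python
# Invented example: a small university with 50 modestly cited papers plus
# one 2,000-author collaboration paper credited to it in full.

ordinary = [4] * 50        # 50 papers with 4 citations each
mega = [1400]              # one hugely cited multi-contributor paper

def mean_citations(papers):
    return sum(papers) / len(papers)

print(mean_citations(ordinary))                   # 4.0
print(round(mean_citations(ordinary + mega), 1))  # 31.4 -- one paper, an eight-fold jump
```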
The rankings show that research in the Arab world is dominated by a few countries. Just over half of the universities in the rankings come from three countries, Egypt, Saudi Arabia and Algeria. However, at the very top the rankings are dominated by Saudi Arabia, which holds the first three places.
The Top Ten are:
1. King Saud University, Saudi Arabia
2. King Abdulaziz University, Saudi Arabia
3. King Abdullah University of Science and Technology, Saudi Arabia
4. Cairo University, Egypt
5. American University of Beirut, Lebanon
6. Mansoura University, Egypt
7. Ain Shams University, Egypt
8. King Fahd University of Petroleum and Minerals, Saudi Arabia
9. Alexandria University, Egypt
10. United Arab Emirates University
There are also 16 subject rankings. Every one of these is topped by a Saudi university except for Social Sciences, which is headed by the American University of Beirut. In first place in the Physics rankings is King Abdulaziz University, which has benefited from those multi-contributor publications that feature at least one of its adjunct faculty with a double affiliation.
Universite Cadi Ayyad Marrakech, Morocco, which was declared by THE to be the best Arab university and best in Africa north of the Kalahari, is in thirtieth place here. I wonder why.
Tuesday, November 04, 2014
The Taiwan (NTU) Rankings
These rankings are entirely research based and make no attempt to measure teaching quality. They also have a very strong bias against the humanities and social sciences: the London School of Economics and the Stockholm School of Economics do not appear at all.
They tend to reward size rather than quality, so that Johns Hopkins is in 2nd place and Caltech 36th. The emphasis on citations gives a boost to medical schools like the University of California San Francisco and Rockefeller University.
The results, with these limitations, are quite reasonable.
Publisher
National Taiwan University
Scope
Global. 903 universities, selected from the Essential Science Indicators and other rankings, were ranked; data are provided for 500.
Indicators
Number of articles 2003-13: 10%
Number of articles 2013: 15%
Number of citations 2003-13: 15%
Number of citations 2012-13: 10%
Average number of citations 2003-13: 10%
h-index 2012-13: 10%
Number of highly cited papers 2003-13: 15%
Number of articles in high-impact journals 2012: 15%
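The overall NTU score is then a weighted sum of these eight indicators. A minimal sketch of the arithmetic, assuming each indicator has already been converted to a 0-100 score (the individual scores below are invented):

```python
# NTU-style weighted total from the eight indicator scores (0-100 scale).
weights = {
    "articles_11yr": 0.10, "articles_current": 0.15,
    "citations_11yr": 0.15, "citations_2yr": 0.10,
    "avg_citations_11yr": 0.10, "h_index_2yr": 0.10,
    "highly_cited_papers": 0.15, "high_impact_articles": 0.15,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%

scores = {name: 80.0 for name in weights}  # invented indicator scores
scores["articles_current"] = 60.0          # weaker recent output

total = sum(weights[name] * scores[name] for name in weights)
print(round(total, 2))  # 77.0
```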
Top Ten (total score)
Place | University |
---|---|
1 | Harvard University |
2 | Johns Hopkins University |
3 | Stanford University |
4 | University of Toronto |
5 | University of Washington Seattle |
6 | University of California Los Angeles |
7 | University of Michigan Ann Arbor |
8= | University of California Berkeley |
8= | Massachusetts Institute of Technology |
8= | University of Oxford |
Countries with Universities in the Top Hundred
Country | Number of Universities |
---|---|
USA | 45 |
UK | 8 |
Netherlands | 7 |
Germany | 5 |
Canada | 5 |
Australia | 4 |
China | 4 |
Sweden | 3 |
Japan | 3 |
France | 2 |
Belgium | 2 |
Denmark | 2 |
Switzerland | 2 |
Singapore | 1 |
Spain | 1 |
Italy | 1 |
Finland | 1 |
Brazil | 1 |
Taiwan | 1 |
Norway | 1 |
South Korea | 1 |
Top Ranked in Region (Total Score)
Region | University |
---|---|
North America | Harvard |
Africa | University of Cape Town |
Europe | Oxford University |
Latin America | Universidade de Sao Paulo |
Asia | University of Tokyo |
Central and Eastern Europe | Charles University in Prague |
Arab World | King Abdulaziz University |
Middle East | Tel Aviv University |
Oceania | University of Melbourne |
Noise Index
In the top 20, the NTU rankings are more volatile than the THE world rankings but less so than QS, with the average university moving up or down 1.2 places since last year.
Ranking | Average Place Change of Universities in the top 20 |
---|---|
NTU rankings 2013-2014 | 1.20 |
THE World Rankings 2013-2014 | 0.70 |
QS World Rankings 2013-2014 | 1.45 |
ARWU 2013-2014 | 0.65 |
Webometrics 2013-2014 | 4.25 |
Center for World University Ranking (Jeddah) 2013-2014 | 0.90 |
Looking at the top 100 universities, the NTU rankings are very volatile, since several of the indicators cover a one or two year period. The average university has changed 7.3 places over the year.
Ranking | Average Place Change of Universities in the top 100 |
---|---|
NTU rankings 2013-2014 | 7.30 |
THE World Rankings 2013-2014 | 4.34 |
QS World Rankings 2013-2014 | 3.94 |
ARWU 2013-2014 | 4.92 |
Webometrics 2013-2014 | 12.08 |
Center for World University Ranking (Jeddah) 2013-2014 | 10.59 |
Note: universities falling out of the top 100 are treated as though they fell to 101st position.
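For anyone who wants to reproduce the Noise Index, it is just the average absolute change in position between two editions, with dropouts imputed as described in the note. A minimal sketch with invented ranks:

```python
# Noise Index: average absolute place change between two editions, with
# universities that drop out of the published range imputed at range + 1.

def noise_index(previous, current, cutoff=100):
    moves = [abs(current.get(uni, cutoff + 1) - old)
             for uni, old in previous.items()]
    return sum(moves) / len(moves)

ranks_2013 = {"A": 1, "B": 2, "C": 3, "D": 98}
ranks_2014 = {"A": 1, "B": 4, "C": 2, "E": 3}  # D fell out of the top 100

print(noise_index(ranks_2013, ranks_2014))  # (0 + 2 + 1 + 3) / 4 = 1.5
```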
Friday, October 31, 2014
Initial Comments on the US News Global Rankings
It was a bit of a surprise when US News & World Report (USNWR) announced that they were going global but perhaps it shouldn't have been. The USNWR has been ranking American colleges since the early 80s, making even the Shanghai Centre for World Class Universities or QS look like novices. Also, with the advance of globalisation of higher education and research there is now a market for comparisons of US universities and their international competitors.
The Best Global Universities rankings are research-based, except for two indicators, each with a 5% weighting, that count PhD degrees. They are also heavily citation-oriented, with a huge 42.5% weighting going to citations. However, the US News staff have used their common sense and included four measures of citations: normalized citation impact, total citations, number of highly cited papers and percentage of highly cited papers.
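As a rough illustration of those four measures, here is a toy calculation on a single university's invented publication list; the field average and the "highly cited" threshold are assumptions for the sake of the example, not USNWR's actual parameters:

```python
# Toy versions of the four citation measures, on invented data.

FIELD_AVG = 10.0      # assumed world-average citations per paper in the field
HIGHLY_CITED_AT = 30  # assumed threshold for a "highly cited" paper

citations = [2, 5, 8, 12, 15, 33, 40, 5, 0, 50]  # one university's papers

total = sum(citations)                                    # total citations
normalized_impact = (total / len(citations)) / FIELD_AVG  # vs field average
highly_cited = sum(1 for c in citations if c >= HIGHLY_CITED_AT)
pct_highly_cited = 100 * highly_cited / len(citations)

print(total, normalized_impact, highly_cited, pct_highly_cited)
# 170 1.7 3 30.0
```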
One result of this methodology is that many of the high fliers in this year's THE rankings are missing. Bogazici University in Turkey, 14th best in Asia according to THE, is absent. So is Federico Santa Maria Technical University in Chile, according to THE the second best in Latin America, and Panjab University, supposedly the second best in India.
The reason for this contrast is simply that THE and Thomson Reuters rewarded these institutions for a few physics papers with hundreds of participating institutions by using a very inappropriate methodology and giving it a 30% weighting. USNWR have trimmed this indicator to 10% and so the high fliers have been grounded.
Friday, October 17, 2014
The university rankings business gets bigger and bigger
US News is going global. There are three different Arab/MENA rankings on the way. Now QS is getting ready for further growth. This is from Education Investor.
Posted on: 16/10/2014
Exclusive: QS seeks £10m investment
The university rankings provider QS is looking to sell a £10 million stake in its business, EducationInvestor understands.
According to its website, QS runs websites and events that connect graduates and employers. But it is best known for its World University Rankings, which it claims are “the most widely read university comparison of their kind”.
Three sources close to the matter said a deal was on the table, and one said that first round bids had already been submitted. QS wants to raise the cash “half to buy out an existing shareholding and half to use as growth capital”.
However, Nunzio Quacquarelli, managing director and majority shareholder of QS, told EducationInvestor that the firm was “looking at all options, both debt and possibly structured finance”.
“We are looking for some external funding to support our rapid growth. Our vision is to be a leading information company in the higher education sector with global ambitions and [with this funding] we aim to continue on this path.”
QS operates in over 70 countries, and has more than 200 staff and 1,200 clients. Its valuation hasn’t been publicised, but the firm is understood to have an ebitda of £3.3 million and revenue of £19.8 million.
According to one source, the deal is expected to complete later in the fourth quarter.