Expectation
David Willetts, the British minister for universities and science, says that he expects more British universities to be in the Times Higher Education World University Rankings top 200.
And if more British universities, then fewer...?
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, October 04, 2011
The US News rankings
The U.S. News rankings of American colleges and universities were released on September 13th. For more information go here.
The top 10 national universities are:
1. Harvard
2. Princeton
3. Yale
4. Columbia
5= Caltech
5= MIT
5= Stanford
5= Chicago
5= University of Pennsylvania
10. Duke
Tuesday, September 13, 2011
Announcement from THE
Times Higher Education have just announced that they will only rank 200 universities this year. Another 200 will be listed alphabetically but not ranked.
Let us be clear: the Times Higher Education World University Rankings list only the world’s top 200 research-led global universities.
We stop our annual list at the 200th place for two reasons. First, it helps us to make sure that we compare like with like. Although those ranked have different histories, cultures, structures and sizes, they all share some common characteristics: they recruit from the same global pool of students and staff; they push the boundaries of knowledge with research published in the world’s leading journals; and they teach at both the undergraduate and doctoral level in a research-led environment.
We unashamedly rank only around 1 per cent of the world’s universities – all of a similar type – because we recognise that the sector’s diversity is one of its great strengths, and not every university should aspire to be one of the global research elite.
But we also stop the ranking list at 200 in the interests of fairness. It is clear that the lower down the tables you go, the more the data bunch up and the less meaningful the differentials between institutions become. The difference between the institutions in the 10th and 20th places, for example, is much greater than the difference between number 310 and number 320. In fact, ranking differentials at this level become almost meaningless, which is why we limit it to 200.
If THE are going to provide sufficient detail about the component indicators to enable analysts to work out how universities compare with each other, this would be a good idea. It would avoid raucous demands that university heads resign whenever the top national university slips 20 places in the rankings, while still allowing analysts to figure out exactly where schools were standing.
It is true, as Phil Baty says, that there is not much difference between being 310th and 320th, but there is, or there would be if the methodology were valid, a difference between 310th and 210th. If THE are just going to present us with an alphabetical list of 200 universities that did not (quite?) make it into the top 200, a lot of usable information will be lost.
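Baty's bunching claim, and my objection to it, are both easy to illustrate with a toy simulation. The skewed score distribution below is invented, not THE's actual data:

```python
import numpy as np

# Toy illustration of "bunching": indicator scores in rankings tend to be
# heavily right-skewed, so gaps between neighbours shrink rapidly as you
# move down the table. The distribution here is invented, not THE's data.
rng = np.random.default_rng(42)
scores = np.sort(rng.lognormal(mean=0.0, sigma=1.0, size=1000))[::-1]

gap_top = scores[9] - scores[19]     # ranks 10 vs 20
gap_low = scores[309] - scores[319]  # ranks 310 vs 320
gap_mid = scores[209] - scores[309]  # ranks 210 vs 310

print(f"gap between ranks 10 and 20:   {gap_top:.3f}")
print(f"gap between ranks 310 and 320: {gap_low:.3f}")
print(f"gap between ranks 210 and 310: {gap_mid:.3f}")
```

The first gap dwarfs the second, which is Baty's point; but the third remains substantial, which is mine: a 100-place differential is still informative well below the top 200.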
The argument that THE is interested only in ranking the leading research-led institutions seems to run counter to THE's emphasis on its bundle of teaching indicators and to the claim that normalisation of citations data can uncover hidden pockets of excellence. If we are concerned only with universities with a research-led environment, then a few pockets, or even a single pocket, should be of little concern.
One also wonders what would happen if disgruntled universities decided that it was not worth the effort of collecting masses of data for TR and THE if the only reward is to be lumped among 200 also-rans.
Saturday, September 10, 2011
QS: The Employer Survey
The employer survey indicator in the QS World University Rankings might be regarded as a valuable assessment tool since it provides an external check on university quality. There are, however, some odd things about this indicator in the 2011 QS Rankings.
Thirteen universities are given scores of 100, of which ten are listed as being in 4th= place, presumably meaning that their scores were identical down to the first or second decimal place. Then 15 schools are listed as being in 15th place with a score of 90, 48 in 51st place with a score of 59.4, and 52 in 100th= place with a score of 55.9.
This probably has something to do with a massive upsurge in responses from Latin America, although exactly what is not clear. QS report that:
"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
Baloney
The economist David Blanchflower has dismissed the QS rankings as "a load of old baloney".
Much of what he says is sensible, indeed obvious. But not entirely.
"This ranking is complete rubbish and nobody should place any credence in it."
A bit too strong. The QS rankings are not too bad in parts, having improved over the last few years, and are moderately accurate at sorting out universities within a country or region. I doubt that anyone seriously thinks that Cambridge is the best university in the world, unless we start counting May balls and punting on the Cam, but it is quite reasonable to say that it is better than Oxford or Durham. Similarly, I wonder if anyone could argue that it is rubbish that Tokyo is the best university in Japan or Cape Town the best in Africa.
"It is unclear whether having more foreign students and faculty should even have a positive rank; less is probably better."
Students, yes; but, if nothing else, more international faculty does mean that a university is recruiting from a larger pool of talent.
Blanchflower does not mention the academic and employer surveys, both of which are flawed but do provide another dimension of assessment, or the faculty-student ratio, which is very crude but might have a slightly closer relationship to teaching quality than the number of alumni who received Nobel prizes decades ago.
He then goes on to compare the QS rankings unfavorably with the Shanghai rankings (which, incidentally, are produced by Shanghai Jiao Tong University, not what he calls the University of Shanghai). I would certainly agree with most of what he says here, but I think we should remember that, flawed as they are, the QS rankings do, unlike the Shanghai index, give some recognition to excellence in the arts and humanities, make some attempt to assess teaching, and provide a basis for discriminating among those universities without Nobel prize winners or Fields medalists.
Finally, I would love to see if Blanchflower has any comments on last year's THE-Thomson Reuters rankings which put Alexandria, Bilkent and Hong Kong Baptist University among the world's research superpowers.
Friday, September 09, 2011
Well Done, QS
QS have just indicated that they have excluded self-citations from their citations per faculty indicator in this year's World University Rankings. This is a very positive move that will remove some of the distortions that have crept into this indicator over the last few years. It would have been even better if they had excluded citations within journals and within institutions. Maybe next year.
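In outline, the change amounts to filtering out citing papers that share an author with the cited paper before computing the indicator. Here is a minimal sketch with hypothetical data structures (QS has not published its matching logic):

```python
# A minimal sketch of the change QS describes: drop citations where the
# citing paper shares an author with the cited paper, then recompute the
# indicator. The data structures here are hypothetical, not QS's.
def citations_per_faculty(papers, faculty_count, exclude_self=True):
    """papers: list of dicts with an 'authors' set and a list of
    'citing_authors' sets, one per citing paper."""
    total = 0
    for paper in papers:
        for citing in paper["citing_authors"]:
            if exclude_self and paper["authors"] & citing:
                continue  # shared author => self-citation, skipped
            total += 1
    return total / faculty_count

papers = [{"authors": {"smith", "lee"},
           "citing_authors": [{"smith"}, {"jones"}, {"lee", "wong"}]}]
print(citations_per_faculty(papers, 2))                      # 0.5
print(citations_per_faculty(papers, 2, exclude_self=False))  # 1.5
```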
It will be interesting to see if Times Higher Education and Thomson Reuters do the same with their rankings in October. It would not be very difficult, and it might help to exclude Alexandria University and a few others from an undeserved place among the world's top universities for research impact.
(By the way, Karolinska Institute is not in the US.)
Although it may not make very much difference at the very top of this indicator, it seems that some places have suffered severely and others have benefited from the change. According to the QS Intelligence Unit:
- Of all of the institutions we looked at the institution with the largest absolute number of self-citations, by some margin, is Harvard with over 93,000 representing 12.9% of their overall citations count
- The top five institutions producing over 3,000 papers, in terms of proportion of self-citations are all in Eastern Europe – St Petersburg State University, Czech Technical University, Warsaw University of Technology, Babes-Bolyai University and Lomonosov Moscow State University
- The top five in terms of the difference in citations per paper when self-citations are excluded are Caltech, Rockefeller, UC Santa Cruz, ENS Lyon and the University of Hawaii
- And the top 10 in terms of the difference in citations per faculty when self-citations are excluded are:
# | Institution | Country
---|---|---
1 | California Institute of Technology (Caltech) | United States |
2 | Rockefeller University | United States |
3 | Stanford University | United States |
4 | Gwangju Institute of Science and Technology (GIST) | South Korea |
5 | Karolinska Institute | United States |
6 | Princeton University | United States |
7 | Leiden University | Netherlands |
8 | Harvard University | United States |
9 | University of California, San Diego (UCSD) | United States |
10 | University of California, San Francisco (UCSF) | United States |
Tuesday, September 06, 2011
The Best University in the World
Update 8/9/2011 -- some comments added
For many people the most interesting thing about the QS rankings is the battle for the top place. The Shanghai rankings put Harvard in first place year after year and no doubt will do so for the next few decades. QS, when it was in partnership with Times Higher Education, also routinely put Harvard first. This is scarcely surprising since the research prowess of Cambridge has steadily declined in recent years. Still, Cambridge, Oxford and two London colleges did quite well, mainly because they got high scores for international faculty and students and for the academic survey (not surprising, since a disproportionate number of responses came from the UK, Australia and New Zealand), but not well enough to overcome their not very distinguished research records.
Last year, however, Cambridge squeezed past Harvard. This was not because of the academic and employer surveys: those remained at 100 for both places. What happened was that between 2009 and 2010 Cambridge's score for citations per faculty increased from 89 to 93. This would be a fine achievement if it represented a real improvement. Unfortunately, almost every university with a score above 60 for this indicator in 2009 went up by a similar margin in 2010, while universities with scores below 50 slumped. Evidently, there was a new method of converting raw scores; one possibility is sketched below. Perhaps a mathematician out there can help.
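One candidate explanation, offered purely as a guess rather than QS's documented method, is a switch from linear min-max scaling to a z-score transform mapped through the normal CDF, which stretches the top of a skewed distribution into the 90s while compressing the bottom:

```python
import numpy as np
from scipy.stats import norm

# A guess at the 2009 -> 2010 change, not QS's documented method: raw
# citation rates are skewed, and how you rescale them to 0-100 matters.
rng = np.random.default_rng(0)
raw = rng.lognormal(mean=1.0, sigma=0.8, size=500)  # invented raw scores

# Old-style linear min-max scaling: only the single best score nears 100.
minmax = 100 * (raw - raw.min()) / (raw.max() - raw.min())

# Z-score scaling through the normal CDF: a whole cluster near the top
# is pushed into the 90s, while below-average scores are pulled down.
zscaled = 100 * norm.cdf((raw - raw.mean()) / raw.std())

top10 = np.argsort(raw)[-10:]  # indices of the ten highest raw scores
print("min-max scores of top 10:", np.round(minmax[top10], 1))
print("z-based scores of top 10:", np.round(zscaled[top10], 1))
```

Under the second transform, a university with a raw score well above the mean jumps several points without any change in its underlying citation performance, which is consistent with what happened to Cambridge.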
And this year?
Cambridge and Harvard are both at 100 for the academic and employer surveys, just like last year. (Note that although Harvard does better than Cambridge in both surveys, they get the same reported score of 100.)
For the faculty student ratio Harvard narrowed the gap a little from 3 to 2.5 points. In citations per faculty Cambridge slipped a bit by 0.3 points. However, Cambridge pulled further ahead on international students and faculty.
Basically, from 2004 to 2009 Harvard reigned supreme because its obvious superiority in research was more than enough to offset the advantages Cambridge enjoyed with regard to internationalisation (small country and policies favouring international students), faculty student ratio (counting non-teaching research staff) and the academic survey (disproportionate responses from the UK and Commonwealth). But this year and last the change in the method of converting the raw scores for citations per faculty artificially boosted Cambridge's overall scores.
So, is Cambridge really the world's top university?
Monday, September 05, 2011
The THE-TR Rankings
The THE-TR World University Rankings will be published on October 6th.
There will be some changes. The weighting given to the citations indicator will be slightly reduced to 30% and internationalisation gets 7.5% instead of 5%.
There will be some tweaking of the citations indicator to avoid a repeat of the Alexandria and other anomalies. Let's hope it works.
In the research indicator there will be a reduction in the weighting given to the survey, and public research income as a percentage of research income will be removed.
There will, unfortunately, be a slight increase in the weighting given to international students and a decrease in that given to international faculty.
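Put together, the announced weights imply a composite calculation along these lines. The 30% and 7.5% figures are from the announcement; the remaining split and the indicator scores below are illustrative assumptions:

```python
# Sketch of how the announced THE-TR re-weighting changes a composite score.
# The 30% and 7.5% figures are from the post; the rest of the split and the
# example indicator scores are assumptions made for illustration.
weights_2011 = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,              # slightly reduced, per the announcement
    "internationalisation": 0.075,  # up from 5%
    "industry_income": 0.025,       # assumed remainder so weights sum to 1
}
scores = {  # hypothetical university, 0-100 on each indicator
    "teaching": 70, "research": 65, "citations": 90,
    "internationalisation": 95, "industry_income": 40,
}
overall = sum(weights_2011[k] * scores[k] for k in weights_2011)
print(f"overall score: {overall:.1f}")  # 75.6 for these invented inputs
```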
Commentary on the 2011 QS World University Rankings
From India
"University of Cambridge retains its number one spot ahead of Harvard, according to the QS World University Rankings 2011, released today. Meanwhile, MIT jumps to the third position, ahead of Yale and Oxford.
While the US continues to dominate the world ranking scenario, taking 13 of top 20 and 70 of top 300 places, 14 of 19 Canadian universities have ranked lower than 2010. As far as Europe is concerned, Germany, one of the emerging European destinations in recent times, has no university making it to the top 50 despite its Excellence Initiative.
Asian institutions - particularly those from Japan, Korea, Singapore, Hong Kong and China - have fared well at a discipline level in subject rankings produced by QS this year - this is particularly true in technical and hard science fields.
Despite the Indian government's efforts to bring about a radical change in the Indian higher education sector, no Indian university has made it to the top 200 this year. However, China has made it to the top 50 and Middle East in the top 200 for the first time.
According to Ben Sowter, QS head of research, "There has been no (relative) improvement from any Indian institution this year. The international higher education scene is alive with innovation and change, institutions are reforming, adapting and revolutionising. Migration amongst international students and faculty continues to grow with little sign of slowing. Universities can no longer do the same things they have always done and expect to maintain their position in a ranking or relative performance.""
Commentary on 2011 QS World University Rankings
SEÁN FLYNN, Education Editor
TCD AND UCD have continued to slide down the world university rankings in a trend which will concern Government, business and heads of colleges.
The latest QS rankings – published this morning – show a substantial drop in ranking for most Irish universities.
TCD drops down 13 places to 65; UCD is down 20 places from 114 to 134. NUI Galway suffers the most dramatic fall, down 66 places to 298. UCC bucked the trend, up marginally from 184 to 181.
The new international league table is a serious blow to the Irish university sector. Two years ago TCD was in the elite top 50 colleges, while UCD was in the top 100. Over the past two years both of Ireland’s leading colleges have lost significant ground.
The fall in Irish rankings was widely expected as the university sector has struggled to cope with a 6 per cent decline in employment and a funding crisis.
Commentary on the 2011 QS World University Rankings
"In tough times, good news comes for Australian institutions in Eighth QS World University Rankings®- Eighth annual QS World University Rankings® sees all of the Group of Eight featured in the top 300
- Australian National University (26) remains Australia’s best-performing university but falls by 6 places.
- Seventeen Australian institutions featured in the top 300
- Based on six indicators including surveys of over 33,000 global academics and 16,000 graduate employers, the largest of their kind ever conducted
- New in 2011: results published alongside comparative international tuition fee on www.topuniversities.com"
Commentary on the 2011 QS World University Rankings
"PETALING JAYA: Universiti Malaya (UM) is the only Malaysian institution that has made it to the top 200 of the QS World University Rankings 2011/12.
It moved up 40 places to 167 this year compared to 207 in 2010.
Universiti Kebangsaan Malaysia (UKM), Universiti Sains Malaysia (USM), Universiti Putra Malaysia (UPM) and Universiti Teknologi Malaysia (UTM) have all slid down the rankings (see table).
UKM is ranked 279 this year compared to 263 in 2010; USM at 335 (309), UPM 358 (319) and UTM at between 401 and 450 (365).
...
For the first time, the International Islamic University Malaysia (IIUM) and Universiti Teknologi Mara (UiTM) were included in the rankings at 451-500 and 601+ respectively."
Commentary on the QS 2011 World University Rankings
"Dubai: UAE University (UAEU) has moved up 34 places to come 338th in the Quacquarelli Symonds (QS) World University Rankings, which looked at more than 2,000 institutions to come up with a top 500 list.
UAEU officials said the university is working toward a top 100 spot. The university was also ranked 299th in the Life Sciences & Medicine subject category. The University of Cambridge was ranked as the top university in the world, followed by Harvard University, Massachusetts Institute of Technology (MIT), Yale University and the University of Oxford.
Saudi Arabia's King Saud University (KSU) came 200th and tops the list among Middle East institutions with King Fahd University of Petroleum & Minerals (KFUPM) and King Abdul Aziz University (KAU) coming in second and fifth respectively. American University of Beirut came third and UAEU fourth."
Sunday, September 04, 2011
QS Rankings Published
The rankings have now been published and can be accessed here.
The top 300 are included with total scores and tuition fees.
QS Rankings Update
Some highlights are provided by CNW
Highlights:
- Global: University of Cambridge retains number one spot ahead of Harvard, while MIT jumps to third ahead of Yale and Oxford; 38 countries in top 300
- Government and private funding for technology-focussed research is eroding the dominance of traditional comprehensive universities. The average age of the top 100 institutions has dropped by seven years since 2010, reflecting the emergence of newer specialist institutions particularly in Asia
- US/Canada: US takes 13 of top 20 and 70 of top 300 places; McGill (17) and Toronto (23) both up, but 14 of 19 Canadian universities rank lower than 2010
- UK/Ireland: Oxford (5) and Imperial (6) leapfrog UCL (7), as four UK universities make the top 10; TCD (65) and UCD (134) both drop
- Continental Europe: ETH Zurich (18) leads ENS Paris (33), EPFL (35) and ParisTech (36); no German university in top 50 despite Excellence Initiative
- Asia: HKU (22) leads Tokyo (25), NUS (28) and Kyoto (32); India: IITB drops out of top 200; China: Tsinghua (47) joins Peking (46) in top 50
- Australia: Gap between ANU (26) and Melbourne (31) closes from 18 to five, ahead of Sydney (38); G8 all make top 100
- Middle East: King Saud University (200) makes top 200 for first time
- Latin America: USP (169) makes top 200 for first time; five universities in top 300 (Brazil, Chile and Argentina)
QS Rankings Update
The BBC reports on Scottish universities in the rankings.
The University of Glasgow has climbed 18 places in an international league table of higher education institutions.
Glasgow is now 59th in the QS World University Rankings, ahead of St Andrews which is in 97th place. The University of Edinburgh is the highest ranked Scottish institution, moving up two places to 20th position.
Principal of Glasgow, Professor Anton Muscatelli, said it had confirmed its position as one of the world's leading universities.
QS Rankings 2011 Update
According to the Herald Sun, Cambridge has retained its place at the top of the QS rankings.
MELBOURNE is clawing its way up the ranks of the world's best universities, but Canberra is clinging on to top spot.
Australian National University is the nation's best tertiary institution, claiming 26th spot in the international league table.
But Melbourne University is hot on its heels - ranked 31st after jumping seven spots over the past year.
The UK's famed Cambridge University has claimed pole position, followed by Harvard University, Massachusetts Institute of Technology and Yale University in the US.
Oxford University rounded out the top five, according to the QS World University Rankings, released yesterday.
Victoria's other top performers were Monash University, which jumped one spot to 60, and RMIT at 228.
QS Rankings Update
Although they have not been released yet, some news about the QS 2011 rankings is trickling out.
From Todayonline in Singapore
NTU [Nanyang Technological University] , NUS [National University of Singapore] climb the ladder in global university rankings
NTU leaps 16 places to take 58th spot, while NUS moves up three notches to take 28th spot
Saturday, September 03, 2011
The QS Rankings are Coming
QS will release their 2011 World University Rankings at 0101 GMT on Monday. They have already sent out fact files to the 600+ listed universities.
Things to look for:
Will Harvard regain its position at the top from Cambridge? It might if QS revert to the previous method of converting the raw scores for the citations per faculty indicator.
Will spending cuts lead to a decline in the observed quality of British universities?
Will universities in China, Korea, Latin America and the Middle East repeat the successes they recorded in the Shanghai rankings?
Will Universiti Malaya return to the top 200? If it does, will it be acknowledged as the number one Malaysian university?
Watch this space
Monday, August 29, 2011
Japanese Universities Send a Strong Request
A very interesting document from the top 11 Japanese research universities has appeared. They are unhappy with the citations indicator in last year's Times Higher Education -- Thomson Reuters World University Rankings.
"The purpose of analyzing academic research data, particularly publication and citation trends is to provide diverse objective information on universities and other academic institutions that can be used by researchers and institutions for various evaluations and the setting of objectives. The 2010 Thomson Reuters / THE World University Rankings, however, do not give sufficient consideration to the unique characteristics of universities in different countries or the differing research needs and demands from society based on country, culture and academic field. As a result, those rankings are likely to lead to an unbalanced misleading and misuse of the citation index.
RU11 strongly requests, therefore, that Thomson Reuters / THE endeavors to contribute to academic society by providing objective and impartial data, rather than imposing a simplistic and trivialized form of university assessment."
It is a tactical mistake to go on about uniqueness. This is an excuse that has been used too often by institutions whose flaws have been revealed by international rankings.
Still, they do have a point. They compare the standing of Asian universities on the citations indicator in the THE-TR rankings with their standing on the citations per paper indicator in the 2010 QS Asian University Rankings, on citations per paper over an 11-year period from TR's Essential Science Indicators, and on citations per faculty in the 2010 QS World University Rankings (I assume they mean citations per faculty here, since the QS World University Rankings do not have a citations per paper indicator). On the THE-TR indicator leading Japanese universities do badly while Chinese, Korean and other Asian universities do very well; on the other measures the pattern is quite different.
They complain that the THE-TR rankings emphasise "home run papers" and research that produces immediate results, and that regional modification (normalisation) discriminates against Japanese universities.
This is no doubt a large part of the story, but I suspect that the distortions of the 2010 THE-TR indicator also stem from differences in the practice of self-citation and intra-university citation, from TR's methodology actually favoring those who publish relatively few papers, and from its bias towards low-cited disciplines.
The document continues:
"1. The ranking of citations based on either citations per author (or faculty) or citations per paper represent two fundamentally different ways of thinking with regards to academic institutions: are the institutions to be viewed as an aggregation of their researchers, or as an aggregation of the papers they have produced? We believe that the correct approach is to base the citations ranking on citations per faculty as has been the practice in the past.
2. We request a revision of the method used for regional modification.
3. We request the disclosure of the raw numerical data used to calculate the citation impact score for the various research fields at each university."
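The first request turns on the choice of denominator. The made-up numbers below show how the two measures can rank the same institutions in opposite orders:

```python
# The two denominators RU11 contrasts, on made-up numbers. A university
# whose staff publish selectively can look strong per paper but weak per
# faculty member, and vice versa.
universities = {
    # name: (total_citations, papers, faculty)
    "U-Selective": (50_000, 5_000, 4_000),   # fewer, better-cited papers
    "U-Prolific":  (80_000, 16_000, 4_000),  # many papers, modest citations
}
for name, (cites, papers, faculty) in universities.items():
    print(f"{name}: {cites / papers:.1f} citations per paper, "
          f"{cites / faculty:.1f} citations per faculty")
# U-Selective wins per paper (10.0 vs 5.0);
# U-Prolific wins per faculty (20.0 vs 12.5).
```

Whether staff or output is the right denominator is precisely the "two fundamentally different ways of thinking" the document describes.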
I suspect that TR and THE would reply that their methodology identifies pockets of excellence (which for some reason cannot be found anywhere in the Japanese RU 11), that the RU 11 are just poor losers and that they are right and QS is wrong.
This question might be resolved by looking at other measures of citations such as those produced by HEEACT, Scimago and ARWU.
It could be that this complaint, if it was sent to TR, was the reason for TR and THE announcing that they were changing the regional weighting process this year. If that turns out to be the case and TR is perceived as changing its methodology to suit powerful vested interests, then we can expect many academic eyebrows to be raised.
If the RU 11 are still unhappy then THE and TR might see a repeat of the demise of the Asiaweek rankings brought on in part because of a mass abstention by Japanese and other universities.
Saturday, August 27, 2011
The THE Citations Indicator
The Research Impact indicator in last year's Times Higher Education - Thomson Reuters World University Rankings led to much condemnation and not a little derision. Alexandria University was fourth in the world for research impact, with Bilkent, Turkey, Hong Kong Baptist University and several other relatively obscure institutions achieving remarkably high scores.
The villain here was Thomson Reuters' field and year normalisation system by which citations were compared with world benchmarks for field and year. This meant that a large number of citations within year of publication to a paper classified as being in a low cited field could have a disproportionate effect, which might be further enhanced if the university was in a region where citations were low.
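The mechanism is easy to reproduce with invented benchmark figures (Thomson Reuters' actual baselines are not public in this form):

```python
# Sketch of field-and-year normalisation as described above; the benchmark
# numbers are invented. Each paper's citations are divided by the world
# average for papers of the same field and year, then averaged.
world_baseline = {  # (field, year) -> mean citations so far, hypothetical
    ("mathematics", 2010): 0.4,   # low-cited field, recent year
    ("molecular biology", 2006): 25.0,
}
papers = [
    {"field": "mathematics", "year": 2010, "citations": 8},        # 20.0
    {"field": "molecular biology", "year": 2006, "citations": 30},  # 1.2
]
impact = sum(p["citations"] / world_baseline[(p["field"], p["year"])]
             for p in papers) / len(papers)
print(f"normalised impact: {impact:.2f}")
# One modestly cited maths paper in its publication year swamps the average:
# (20.0 + 1.2) / 2 = 10.6. With few papers, a single outlier drives the
# score, which is how Alexandria-style anomalies arise.
```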
Now THE have announced that this year there will be three changes. These are:
- raising the threshold for inclusion in the citations indicator from 50 publications per year to 200
- extending the period for counting citations from five to six years
- changing regional normalisation so that it takes account of subject variations within regions as well as the overall level of citations
There are other things that could be done but have apparently not been announced:
- reducing the weighting given to citations
- not counting self-citations, citations within institutions or citations within journals
- using a variety of indicators to assess research impact, such as the h-index, total citations and citations per paper
- using a variety of databases
So, everybody will have to wait until September to see what will happen.
Sunday, August 21, 2011
Value for Money
Recently, the Higher Education Funding Council for England published data indicating the percentage of UK students at English universities with grades of AAB at A level. Oxford and Cambridge were at the top with 99%, and Wolverhampton, Staffordshire and Hertfordshire at the bottom with 2%.
Now Hertfordshire statisticians have produced graphs comparing performance on four British league tables with tuition fees. Hertfordshire offers best value for tuition money in its band. Oxford, Cambridge, LSE, Derby and London Metropolitan do well in theirs. Liverpool John Moores, East London and Bedfordshire are among the worst.
It should be noted that at the moment the differences between tuition levels are relatively small so this table may not mean very much.
Saturday, August 20, 2011
Perhaps They know Something You Don't
The Pew Research Center has issued a report showing that women are more likely than men to see the value of a college education. Men, says the report, are laggards. The implication is that women are more perceptive than men.
At a time when women surpass men by record numbers in college enrollment and completion, they also have a more positive view than men about the value higher education provides, according to a nationwide Pew Research Center survey. Half of all women who have graduated from a four-year college give the U.S. higher education system excellent or good marks for the value it provides given the money spent by students and their families; only 37% of male graduates agree. In addition, women who have graduated from college are more likely than men to say their education helped them to grow both personally and intellectually.
An article in Portfolio.com reviewing the report refers to another study, from the Brookings Institution, which finds that college is in fact an excellent investment.
Why then, are men apparently so uninformed about the benefits of higher education? The Pew report provides part of the answer when it discloses that men are much more likely than women to pay for college by themselves. A good investment, it seems, is even better when it is paid for by somebody else.
Also, let us compare the career prospects of men and women with average degrees in the humanities or social sciences. Even without affirmative action, men who are bored by diversity training, professional development, all sorts of sensitisation and other rituals of the feminised corporation and bureaucracy are unlikely to get very far, if anywhere.
And perhaps men are more likely to grow by themselves.