Monday, August 29, 2011

Japanese Universities Send a Strong Request

A very interesting document from the top 11 Japanese research universities has appeared. They are unhappy with the citations indicator in last year's Times Higher Education - Thomson Reuters World University Rankings.

"The purpose of analyzing academic research data, particularly publication and citation trends is to provide diverse objective information on universities and other academic institutions that can be used by researchers and institutions for various evaluations and the setting of objectives. The 2010 Thomson Reuters / THE World University Rankings, however, do not give sufficient consideration to the unique characteristics of universities in different countries or the differing research needs and demands from society based on country, culture and academic field. As a result, those rankings are likely to lead to an unbalanced misleading and misuse of the citation index.

RU11 strongly requests, therefore, that Thomson Reuters / THE endeavors to contribute to academic society by providing objective and impartial data, rather than imposing a simplistic and trivialized form of university assessment."


It is a tactical mistake to go on about uniqueness. This is an excuse that has been used too often by institutions whose flaws have been revealed by international rankings.

Still, they do have a point. They go on to compare the position of Asian universities on the citations indicator in the THE-TR rankings with three other measures: the citations per paper indicator in the 2010 QS Asian University Rankings, citations per paper over an eleven-year period from TR's Essential Science Indicators, and the citations indicator in the 2010 QS World University Rankings (presumably citations per faculty, since the QS world rankings do not have a citations per paper indicator). On the THE-TR indicator, leading Japanese universities do badly while Chinese, Korean and other Asian universities do very well.

They complain that the THE-TR rankings emphasise "home run papers" and research that produces immediate results, and that regional modification (normalisation) discriminates against Japanese universities.

This is no doubt a large part of the story, but I suspect that the distortions of the 2010 THE-TR indicator also arise from differences in the practice of self-citation and intra-university citation, from TR's methodology actually favouring those who publish relatively few papers, and from its bias towards low-cited disciplines.

The document continues:



"1. The ranking of citations based on either citations per author (or faculty) or citations per paper represent two fundamentally different ways of thinking with regards to academic institutions: are the institutions to be viewed as an aggregation of their researchers, or as an aggregation of the papers they have produced?  We believe that the correct approach is to base the citations ranking on citations per faculty as has been the practice in the past.

2. We request a revision of the method used for regional modification. 

3. We request the disclosure of the raw numerical data used to calculate the citation impact score for the various research fields at each university."

I suspect that TR and THE would reply that their methodology identifies pockets of excellence (which for some reason cannot be found anywhere in the Japanese RU11), that the RU11 are just poor losers, and that they are right and QS is wrong.

This question might be resolved by looking at other measures of citations such as those produced by HEEACT, Scimago and ARWU.

It could be that this complaint, if it was sent to TR, was the reason for TR and THE announcing that they were changing the regional weighting process this year. If that turns out to be the case, and TR is perceived as changing its methodology to suit powerful vested interests, then we can expect many academic eyebrows to be raised.

If the RU11 are still unhappy, then THE and TR might see a repeat of the demise of the Asiaweek rankings, brought on in part by a mass abstention by Japanese and other universities.

Saturday, August 27, 2011

The THE Citations Indicator

The Research Impact indicator in last year's Times Higher Education - Thomson Reuters World University Rankings led to much condemnation and not a little derision. Alexandria University was fourth in the world for research impact, with Bilkent, Turkey, Hong Kong Baptist University and several other relatively obscure institutions achieving remarkably high scores.

The villain here was Thomson Reuters' field and year normalisation system, by which citations were compared with world benchmarks for field and year. This meant that a large number of citations within the year of publication to a paper classified as being in a low-cited field could have a disproportionate effect, which might be further enhanced if the university was in a region where citations were low.
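To make the mechanism concrete, here is a minimal sketch of field- and year-normalised citation impact: each paper's citations are divided by the world average for papers of the same field and year, and the ratios are averaged. All figures below are invented for illustration, not real TR baselines.

```python
# Hypothetical world-average citations per paper, by (field, year).
world_baseline = {
    ("oncology", 2009): 12.0,     # a highly cited field
    ("mathematics", 2009): 1.5,   # a low-cited field
}

# A small university's papers: (field, year, citations received).
papers = [
    ("oncology", 2009, 10),
    ("oncology", 2009, 14),
    ("mathematics", 2009, 30),    # one "home run" paper in a low-cited field
]

def normalised_impact(papers, baseline):
    """Mean of citations divided by the world average for each paper's
    field and year."""
    ratios = [cites / baseline[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# The mathematics paper alone contributes 30 / 1.5 = 20 to the mean,
# swamping the two oncology papers (ratios of 0.83 and 1.17).
print(round(normalised_impact(papers, world_baseline), 2))  # 7.33
```

With only a handful of papers in the denominator, one heavily cited paper in a low-cited field dominates the average, which is how a small institution can post a spectacular research impact score.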

Now THE have announced that this year there will be three changes. These are:

  • raising the threshold for inclusion in the citations indicator from 50 publications per year to 200
  • extending the period for counting citations from five to six years
  • changing regional normalisation so that it takes account of subject variations within regions as well as the overall level of citations.
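
The regional normalisation being revised was reported to work by dividing a university's normalised citation score by the square root of the average score for its country. Treat that rule, and the country averages below, as assumptions for illustration; the point is simply that the same raw score is boosted in a low-citation country and slightly penalised in a high-citation one.

```python
import math

# Invented country-average normalised citation scores (world average = 1.0).
country_avg = {"Japan": 0.9, "Turkey": 0.4, "UK": 1.1}

def regionally_modified(score, country):
    """Assumed regional modification: divide by the square root of the
    country's average score, boosting low-citation countries."""
    return score / math.sqrt(country_avg[country])

# The same raw score of 1.0 looks quite different after modification:
for country in country_avg:
    print(country, round(regionally_modified(1.0, country), 2))
# Japan 1.05, Turkey 1.58, UK 0.95
```

On these assumed figures a Japanese university gains almost nothing from the modification, which is consistent with the RU11 complaint that the adjustment favours their regional rivals.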
Here are some things which Thomson Reuters apparently will not do:

  • reducing the weighting given to citations
  • excluding self-citations, citations within institutions or citations within journals
  • using a variety of indicators to assess research impact, such as the h-index, total citations and citations per paper
  • using a variety of databases

So, everybody will have to wait until September to see what will happen.

Sunday, August 21, 2011

Value for Money

Recently, the Higher Education Funding Council for England (HEFCE) published data indicating the percentage of UK students at English universities with grades of AAB or better at A level. Oxford and Cambridge were at the top with 99%, and Wolverhampton, Staffordshire and Hertfordshire at the bottom with 2%.

Now Hertfordshire statisticians have produced graphs comparing performance on four British league tables with tuition fees. Hertfordshire offers best value for tuition money in its band. Oxford, Cambridge, LSE, Derby and London Metropolitan do well in theirs. Liverpool John Moores, East London and Bedfordshire are among the worst.

It should be noted that at the moment the differences between tuition levels are relatively small so this table may not mean very much.

Saturday, August 20, 2011

Perhaps They Know Something You Don't

The Pew Research Center has issued a report showing that women are more likely than men to see the value of a college education. Men, says the report, are laggards. The implication is that women are more perceptive than men.


At a time when women surpass men by record numbers in college enrollment and completion, they also have a more positive view than men about the value higher education provides, according to a nationwide Pew Research Center survey. Half of all women who have graduated from a four-year college give the U.S. higher education system excellent or good marks for the value it provides given the money spent by students and their families; only 37% of male graduates agree. In addition, women who have graduated from college are more likely than men to say their education helped them to grow both personally and intellectually.


An article in Portfolio.com reviewing the report refers to another study, from the Brookings Institution, that finds that college is in fact an excellent investment.

Why then, are men apparently so uninformed about the benefits of higher education? The Pew report provides part of the answer when it discloses that men are much more likely than women to pay for college by themselves. A good investment, it seems, is even better when it is paid for by somebody else.

Also, let us compare the career prospects of men and women with average degrees in the humanities or social sciences. Even without affirmative action, men who are bored by diversity training, professional development, all sorts of sensitisation and other rituals of the feminised corporation and bureaucracy are unlikely to get very far, if anywhere.

And perhaps men are more likely to grow by themselves.

Wednesday, August 17, 2011

A Rising Crescent?

One advantage of the methodological stability of the Shanghai university rankings is that it is possible to identify long term changes even though the year by year changes may be quite small.

One trend that becomes apparent when comparing the 2003 and 2011 rankings is the increasing number of universities from predominantly Muslim countries.

In 2003 there was exactly one listed, Istanbul University.

This year there were six: King Saud University, Saudi Arabia, in the 201-300 band; King Fahd University of Petroleum and Minerals, Saudi Arabia, Tehran University and Istanbul University in the 301-400 band; and Cairo University and Universiti Malaya in the 401-500 band.

In the next year or two King Abdullah University of Science and Technology, Saudi Arabia, headed by a former president of the National University of Singapore, will probably join the list.

Monday, August 15, 2011

Another Twist in the Plot

The relationship between Malaysian universities and international rankers would make a good soap opera, full of break-ups, reconciliations and recriminations.

It started in 2004 when the first THES-QS ranking put Universiti Malaya (UM) in the top 100 and Universiti Sains Malaysia in the top 200. There was jubilation at the UM campus, with triumphant banners all over the place. Then it all came crashing down in 2005, when QS revealed that they had made a mistake by counting ethnic minorities as international students and faculty. There followed a "clarification of data" and UM was expelled from the top 100.

The Malaysian opposition believed, or pretended to believe, that this was evidence of the unrelenting decline of the country's universities. The Vice-Chancellor of UM went off into the academic wilderness but remained on QS's advisory board.

UM continued to chase the holy grail of a top 200 ranking through the vigorous pursuit of publications and citations. There was discontent among the faculty, voiced in a letter to a local newspaper:

"The writer claimed that many have left UM and “many more are planning to leave, simply because of the expectations from the management”.

“UM is not what it used to be. The academic staff are highly demoralised and unhappy due to the management’s obsession and fixation with ISI publications, while research, consultancy, and contribution to the nation, such as training of PhD students are considered as secondary,” the letter said."

In 2007 the Malaysian government asked universities for plans to be considered for APEX (Accelerated Programme for Excellence) status, which would include a substantial degree of university autonomy. It boiled down to a fight between UM and USM, which USM won, apparently because of its inspiring plans.

'"The selection committee evaluated each university's state of readiness, transformation plan and preparedness for change. The university that is granted apex status is the one that has the highest potential among Malaysian universities to be world-class, and as such, would be given additional assistance to compete with top-ranking global institutions," added Khaled. "Apex is about accelerated change. It is not about business as usual, but business unusual."

"USM has been working on its own transformation plan. We started with the 'Healthy Campus' concept, before moving on to the 'University in a Garden' concept. We subsequently adopted the 'Research University' concept." - Tan Sri Dato' Prof Dzulkifli Abdul Razak, Vice Chancellor, USM.

Selection Committee Chairman Dr. Mohamad Zawawi, former Vice Chancellor of Universiti Malaysia Sarawak, said the committee also paid special attention to the institutions' strategic intent and transformation plans. Visits were made to short-listed institutions where discussions were held with senior staff, academicians, students and staff associations to understand the prevailing campus "climate" and factors related to the proposed plans. With apex status, USM will be given the autonomy to have the best in terms of governance, resources and talent and is expected to move up in the World University Rankings with a target of top 200 in five years and the top 100, if not 50, by 2020.'

Note that USM was expected to use its status to climb the international rankings. However, it is now refusing to have anything to do with the rankings, something that is understandable.

The issue of which university deserves APEX was reopened this morning when it was announced that Universiti Malaya  was in the top 500 of the Shanghai Academic Ranking of World Universities.

This is unlikely to be a mistake like 2004. The Shanghai rankers have had methodological problems, such as what to do about merging or splitting universities, but they do not change the basic methodology and they do not make serious mistakes. We are not going to hear next year about a clarification of data.

UM's success is narrowly based. They have no Nobel prize winners, no highly cited researchers and only a handful of papers in Nature and Science, but quite a lot of publications in ISI-indexed journals. One might complain that there is too much emphasis on quantity, but this is nevertheless a tangible achievement.

Press Release from Shanghai

Here is the press release from Shanghai Jiao Tong University giving more details about this year's rankings.

Monday, August 15, 2011
Shanghai, People's Republic of China
The Center for World-Class Universities of Shanghai Jiao Tong University released today the 2011 Academic Ranking of World Universities (ARWU), marking its 9th consecutive year of measuring the performance of top universities worldwide.
Harvard University tops the 2011 list; other Top 10 universities are: Stanford, MIT, Berkeley, Cambridge, Caltech, Princeton, Columbia, Chicago and Oxford. In Continental Europe, ETH Zurich (23rd) in Switzerland takes first place, followed by Paris-Sud (40th) and Pierre and Marie Curie (41st) in France. The best ranked universities in Asia are University of Tokyo (21st) and Kyoto University (24th) in Japan.
Three universities are ranked among Top 100 for the first time in the history of ARWU: University of Geneva (73rd), University of Queensland (88th) and University of Frankfurt (100th). As a result, the number of Top 100 universities in Switzerland, Australia and Germany increases to 4, 4 and 6 respectively.
Ten universities first enter into Top 500, among them University of Malaya in Malaysia and University of Zagreb in Croatia enable their home countries to be represented, together with other 40 countries, in the 2011 ARWU list.
Progress of universities in Middle East countries is remarkable. King Saud University in Saudi Arabia first appears in Top 300; King Fahd University of Petroleum & Minerals in Saudi Arabia, Istanbul University in Turkey and University of Teheran in Iran move up in Top 400 for the first time; Cairo University in Egypt is back to Top 500 after five years of staggering outside.
The number of Chinese universities in Top 500 increases to 35 in 2011, with National Taiwan University, Chinese University of Hong Kong, and Tsinghua University ranked among Top 200.
The Center for World-Class Universities of Shanghai Jiao Tong University also released the 2011 Academic Ranking of World Universities by Broad Subject Fields (ARWU-FIELD) and 2011 Academic Ranking of World Universities by Subject Field (ARWU-SUBJECT). Top 100 universities in five broad subject fields and in five selected subject fields are listed, where the best five universities are:
Natural Sciences and Mathematics – Harvard, Berkeley, Princeton, Caltech and Cambridge
Engineering/Technology and Computer Sciences – MIT, Stanford, Berkeley, UIUC and Georgia Tech
Life and Agriculture Sciences – Harvard, MIT, UC San Francisco, Cambridge and Washington (Seattle)
Clinical Medicine and Pharmacy – Harvard, UC San Francisco, Washington (Seattle), Johns Hopkins and Columbia
Social Sciences – Harvard, Chicago, MIT, Berkeley and Columbia
Mathematics – Princeton, Harvard, Berkeley, Stanford and Cambridge
Physics – MIT, Harvard, Caltech, Princeton and Berkeley
Chemistry – Harvard, Berkeley, Stanford, Cambridge and ETH Zurich
Computer Science – Stanford, MIT, Berkeley, Princeton and Harvard
Economics/Business – Harvard, Chicago, MIT, Berkeley and Columbia
The complete lists and detailed methodologies can be found at the Academic Ranking of World Universities website at http://www.ShanghaiRanking.com/.
Academic Ranking of World Universities (ARWU): Starting from 2003, ARWU has been presenting the world top 500 universities annually based on a set of objective indicators and third-party data. ARWU has been recognized as the precursor of global university rankings and the most trustworthy list. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution. More than 1000 universities are actually ranked by ARWU every year and the best 500 are published.
Center for World-Class Universities of Shanghai Jiao Tong University (CWCU): CWCU has been focusing on the study of world-class universities for many years, published the first Chinese-language book titled world-class universities and co-published the first English book titled world-class universities with European Centre for Higher Education of UNESCO. CWCU initiated the "International Conference on World-Class Universities" in 2005 and organizes the conference every second year, which attracts a large number of participants from all major countries. CWCU endeavors to build databases of major research universities in the world and clearinghouse of literature on world-class universities, and provide consultation for governments and universities.
Contact: Dr. Ying CHENG at ShanghaiRanking@gmail.com
Breaking News

The Shanghai rankings are out. Go here.

One interesting result is that Universiti Malaya is in the top 500 for the first time, mainly because of  a large number of publications.

Thursday, August 11, 2011

America's Best Colleges

As the world waits anxiously for the publication of the Princeton Review's Stone Cold Sober School Rankings (like everybody else, I am praying the US Air Force Academy stays in the top ten), there are a few less important rankings, like Shanghai's ARWU or the Forbes/CCAP (Center for College Affordability and Productivity) rankings, to study.

The latter, which have just been released, are designed for student consumers. The components are student satisfaction, post-graduate success, student debt, four-year graduation rate and competitive awards. They clearly fulfill a need that other rankings do not. It is possible that some of the indicators could be adopted for an international ranking. The top 10, a mix of Ivy League schools, small liberal arts colleges and service academies, are:

1. Williams College
2. Princeton
3. US Military Academy
4. Amherst College
5. Stanford
6. Harvard
7. Haverford College
8. Chicago
9. MIT
10. US Air Force Academy

Tuesday, August 09, 2011

The Missing Indicator

There is an interesting item in Times Higher Education. Apparently leading universities might be offering financial inducements in the form of scholarships to students with AAB at A level. For Americans that would be something like a 3.9 GPA.

Universities like to proclaim that they provide something that enables students to succeed in the post-university world, such as training in critical thinking, soft skills or exposure to a diverse multicultural society, for which massive tuition fees can be extracted from parents or government. Whether they actually do so is debatable. A book-length study by Richard Arum and Josipa Roksa, Academically Adrift, indicated that universities do little to teach students how to think.

Employers seem to have a rather different view of the matter. Many restrict themselves to recruiting from only the elite universities and are quite unconcerned with whether universities have established learning outcomes or whether they have created a safe space for diverse students. They simply wish to recruit the most intelligent students that they can, with perhaps a bit of charm and likability for publicly visible positions.

Left to their own devices, universities would probably do their best to admit the most intelligent students available. The news that some are willing to pay for those with good A levels is a stark reminder that they are not being entirely honest in claiming that they provide an excellent education that is worth paying for. If that were the case, why not just put in a bit more effort and a few thousand pounds to turn a BBB or CCC student into an AAB one? The answer is that while you might get a bit more out of a student by teaching reading and writing skills, basic numeracy and so on, recruiting from the top of the cognitive scale will bring much more to a university.

Calculating the average academic ability of students, or better still their underlying intelligence, might be an extremely valuable component in any international ranking system.

Research Blogs has a table, derived from data from the Higher Education Funding Council for England (HEFCE), of universities ranked by the percentage of UK students with AAB or better at A levels.

If it were possible to calculate equivalences between the standardised tests or qualifications of various countries, with some sort of adjustment for national differences in education or literacy, then a global ranking of universities according to student quality would be quite feasible.
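
One way such an equivalence might be sketched is to convert the share of each country's cohort reaching a given grade into a z-score, assuming normally distributed ability. The cohort shares below are invented for illustration, not real data, and the normality assumption is itself a large one.

```python
from statistics import NormalDist

# Hypothetical share of each country's cohort at or above the grade.
top_share = {
    ("UK", "AAB"): 0.10,           # assume 10% of A-level takers reach AAB
    ("US", "GPA 3.9"): 0.08,
    ("Germany", "Abitur 1.3"): 0.12,
}

def grade_cutoff_z(country, grade):
    """Z-score of the weakest student at this grade: the inverse normal
    CDF evaluated at (1 - share of cohort at or above the grade)."""
    return NormalDist().inv_cdf(1 - top_share[(country, grade)])

for key in sorted(top_share):
    print(key, round(grade_cutoff_z(*key), 2))
```

A university's average entrant z-score, aggregated across countries, would then be internationally comparable; the hard part, as noted above, is the adjustment for national differences in education and literacy, which this sketch simply assumes away.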


The first column ranks institutions by the percentage of UK entrants with AAB or better; the second shows their position when ranked by the number of such students. The remaining columns give the institution, the number of UK students with AAB+ and the percentage.
Rank (%) | Rank (number) | Institution | UK students AAB+ | %
1 3 University of Oxford 2568 99
2 4 University of Cambridge 2554 99
3 18 Imperial College London 1094 96
4 23 London School of Economics 686 93
5 2 University of Durham 2581 85
6 8 University of Bristol 2199 85
7 13 University College London 1648 82
8 9 University of Warwick 2068 81
9 7 University of Exeter 2368 74
10 15 University of Bath 1496 69
11 17 King's College 1238 68
12 57 Royal Veterinary College 195 60
13 86 Conservatoire for Dance & D 71 59
14 35 SOAS 353 57
15 5 University of Nottingham 2505 57
16 12 University of Southampton 1686 54
17 14 University of York 1538 53
18 1 University of Manchester 2776 51
19 11 University of Sheffield 1846 49
20 10 University of Birmingham 1883 48
21 6 University of Leeds 2376 47
22 16 University of Newcastle 1332 43
23 19 Loughborough University 1042 38
24 91 School of Pharmacy 59 37
25 21 Lancaster University 740 32
26 25 Royal Holloway, London 589 32
27 20 University of Liverpool 886 32
28 33 City University 414 31
29 26 University of Leicester 578 30
30 22 Queen Mary 702 29
31 24 University of Sussex 630 29
32 32 Aston University 429 25
33 27 University of East Anglia 538 25
34 93 Blackpool and the Fylde 52 24
35 30 University of Surrey 433 23
36 95 Blackburn College 51 22
37 29 University of Reading 455 19
38 58 Goldsmiths College 180 17
39 72 University of Chichester 114 13
40 37 Brunel University 325 12
41 84 University College Falmouth 78 11
42 31 University of the Arts London 430 11
43 28 University of Kent 480 10
44 76 University of Cumbria 105 10
45 89 Arts UC at Bournemouth 67 9
46 67 Bath Spa University 144 9
47 55 University of Essex 200 8
48 56 University of Teesside 197 8
49 43 Southampton Solent 254 8
50 69 University Creative Arts 136 8
51 48 University of Lincoln 232 8
52 68 University of Worcester 142 8
53 80 Liverpool Hope University 97 8
54 42 Coventry University 262 7
55 50 Bournemouth University 227 7
56 54 Oxford Brookes University 213 7
57 63 University of Bedfordshire 163 7
58 64 University of East London 157 7
59 79 Edge Hill University 97 7
60 51 University of Brighton 215 7
61 39 University of Northumbria 283 7
62 41 University of Plymouth 278 7
63 62 London South Bank 165 7
64 60 Birmingham City University 170 6
65 75 University of Northampton 110 6
66 36 Sheffield Hallam University 340 6
67 65 Anglia Ruskin University 156 6
68 34 Nottingham Trent University 357 6
69 49 University of Hull 228 6
70 78 Keele University 99 6
71 77 University of Chester 100 6
72 52 University of Huddersfield 214 6
73 40 Kingston University 280 6
74 44 University West of England 238 5
75 46 University of Westminster 235 5
76 70 University of Derby 133 5
77 92 University of Bolton 55 5
78 73 University of Bradford 112 5
79 87 Thames Valley University 68 5
80 38 Manchester Metropolitan 290 5
81 47 De Montfort University 232 5
82 45 London Metropolitan 237 5
83 53 Liverpool John Moores 213 5
84 71 Middlesex University 126 5
85 61 Central Lancashire 168 4
86 94 Roehampton University 51 4
87 83 University of Sunderland 79 4
88 90 University of Gloucestershire 65 4
89 59 University of Greenwich 178 3
90 66 University of Portsmouth 152 3
91 74 Leeds Metropolitan 111 3
92 88 University of Salford 67 3
93 85 University of Wolverhampton 76 2
94 82 University of Hertfordshire 85 2
95 81 Staffordshire University 91 2

Worth Reading

University World News has some new articles on rankings by Ellen Hazelkorn, Philip Altbach and Danny Byrne of QS. It also provides links to several older articles.

Tuesday, August 02, 2011

Ranking Fever Rages in the US

Forget Shanghai, QS and Times Higher. The really important ranking has just been released by Princeton Review.

Ohio University in Athens has replaced the University of Georgia in Athens as the top party school (one wonders if all the respondents could remember where they were).

Other number ones are:

Amazing College Campus: Elon University, NC
Top Online University: Penn State University World Campus 

And more to come.