Sunday, September 02, 2018

Ranking US Rankings

Forbes Magazine has an article by Willard Dix that ranks US college ranking sites. The ranking is informal, without specified indicators, but the author does give us an idea of what he thinks a good ranking should do.

Here are the top five of thirteen:
1.  US News: America's Best Colleges
2.  Money magazine: Best Colleges Ranking
3.  Forbes: America's Top Colleges
4.  Kiplinger's Best College Values
5.  Washington Monthly: College Guide and Rankings.

Reading through the comments it is possible to get an idea of the criteria of a good ranking. Rankings should contain a lot of information; they should be comprehensive and include a large number of institutions; they should provide data that helps prospective students and stakeholders; they should have been published for several years; if they use surveys they should have a lot of respondents; and they should have face validity (a list with a "revolutionary algorithm" that puts non-Ivy places at the top comes in 13th).





Friday, August 24, 2018

Why is Australia doing well in the Shanghai rankings?

I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers." 

The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.

So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities rose a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.

The University of Queensland was unchanged in 55th place while Monash fell from 78th to 91st  and the University of Western Australia from 91st to 93rd. 

How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit  and papers in Nature and Science down a bit.  

What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a difference of 8.1 points on the indicator or, after applying the indicator's 20% weighting, 1.62 points of the overall score. The difference in total scores between 2017 and 2018 was 1.9, so those highly cited researchers made up most of the difference.
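To make the arithmetic explicit, here is a minimal sketch in Python; the 20% figure is ARWU's published weight for the highly cited researchers indicator, and everything else comes from the scores above.

```python
# ARWU's overall score is a weighted sum of indicator scores; the highly
# cited researchers (HiCi) indicator carries a 20% weight.
HICI_WEIGHT = 0.20

hici_2017, hici_2018 = 15.4, 23.5             # ANU's HiCi indicator scores
indicator_gain = hici_2018 - hici_2017        # 8.1 points on the indicator
weighted_gain = indicator_gain * HICI_WEIGHT  # 1.62 points of the overall score

total_gain = 1.9                              # ANU's overall gain, 2017 to 2018
print(f"HiCi contributed {weighted_gain:.2f} of {total_gain} points "
      f"({weighted_gain / total_gain:.0%} of the rise)")
```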

In 2016 ANU had two researchers on the list, which was used for the 2017 rankings. One was also on the 2017 list, used in 2018. In 2017 there were six ANU highly cited researchers: one from the previous year, one who had moved from MIT, and four long-serving ANU researchers.

Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.

But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.

The improved scores for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Sciences Citation Index.

Saturday, August 18, 2018

Who Cares About University Rankings?

A paper by Ludo Waltman and Nees Jan van Eck asks what users of the Leiden Ranking are interested in. There's some interesting stuff but for now I just want to look at where the users come from.

The top ten countries where visitors originate are:

1.  USA
2.  Australia
3.  Netherlands
4.  UK
5.  Turkey
6.  Iran
7.  South Korea
8.  France
9.  Germany
10. Denmark.

The authors consider the number of visitors from Australia, Turkey, Iran and South Korea to be "quite remarkable."

Let's look at other signs of interest in rankings. Here are the top countries for respondents to the 2018 QS academic survey:

1.  USA
2.  UK
3.  Malaysia
4= Australia
4= South Korea
4= Russia
7= Italy
7= Japan
9= Brazil
9= Canada

And here are the top ten countries for visitors to this blog:

1. USA
2. UK
3. Russia
4. France
5. Germany
6. Ukraine
7. Canada
8. Malaysia
9. Australia
10. Singapore.

The three countries on all three lists are UK, USA and Australia. The countries on two lists are South Korea, Russia, Malaysia, France, Germany and Canada.

https://www.cwts.nl/blog?article=n-r2s2a4&title=what-are-users-of-the-cwts-leiden-ranking-interested-in

http://rankingwatch.blogspot.com/2018/06/responses-to-qs.html

Saturday, August 11, 2018

Will THE do something about the citations indicator?


International university rankings can be a bit boring sometimes. It is difficult to get excited about the Shanghai rankings, especially at the upper end: Chicago down two places, Peking up one. There was a bit of excitement in 2014 when there was a switch to a new list of highly cited researchers and some universities went up or down a few places, or even a few dozen, but that seems to be over now.

The Times Higher Education (THE) world rankings are always fun to read, especially the citations indicator, which since 2010 has proclaimed a succession of unlikely places as having an outsize influence on the world of research: Alexandria University, Hong Kong Baptist University, Bilkent University, Royal Holloway University of London, National Research Nuclear University MEPhI Moscow, Tokyo Metropolitan University, Federico Santa Maria Technical University Chile, St George's University of London, Anglia Ruskin University Cambridge, and Babol Noshirvani University of Technology Iran.

I wonder if the good and the great of the academic world ever feel uncomfortable about going to those prestigious THE summits while places like the above are deemed the equals, or the superiors, of Chicago or Melbourne or Tsinghua for research impact. Do they even look at the indicator scores?

These remarkable results are not the product of deliberate cheating but of THE's methodology. First, research documents are divided into 300-plus fields, five types of documents, and five years of publication, and then the world average number of citations (the mean) is calculated for each type of publication in each field and in each year. Altogether there are 8,000-plus "cells" against which the citation record of each university in the THE rankings is compared.

This means that if a university manages to get a few moderately cited publications in a field where citations are typically low, it can easily get a very high citations score.
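A stylised sketch of how mean-based cell normalisation rewards a handful of papers in a low-cited field; the fields, years and citation counts below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical world corpus: each paper is tagged with a (field, doc type,
# year) "cell" and a citation count.
world = [
    ("particle physics", "article", 2016, 40),
    ("particle physics", "article", 2016, 20),
    ("nursing", "article", 2016, 1),
    ("nursing", "article", 2016, 0),
]

cells = defaultdict(list)
for field, doc, year, cites in world:
    cells[(field, doc, year)].append(cites)
cell_mean = {cell: mean(v) for cell, v in cells.items()}  # world benchmarks

def normalised_impact(papers):
    """Mean of each paper's citations divided by its cell's world mean."""
    return mean(cites / cell_mean[cell] for cell, cites in papers)

# Two single-citation papers in a low-cited field (world mean 0.5) score
# "twice the world average" from just two citations in total.
print(normalised_impact([(("nursing", "article", 2016), 1),
                         (("nursing", "article", 2016), 1)]))  # 2.0
```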

Added to this is a "regional modification" whereby the final citation impact score is divided by the square root of the score of the country in which the university is located. This gives most universities an increased score, very small for those in productive countries and very large for those in countries that generate few citations. The modification is now applied to half of the citations indicator score.
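As I understand the procedure, it works roughly as in the sketch below; the 0-100 country scale and the example numbers are assumptions for illustration, not THE's actual data.

```python
import math

def regional_modification(uni_score, country_score):
    """Divide a university's citation score by the square root of its
    country's score (both assumed here to be on a 0-100 scale)."""
    return uni_score / math.sqrt(country_score / 100)

def blended_score(uni_score, country_score):
    # The modification is now applied to only half of the indicator.
    return 0.5 * uni_score + 0.5 * regional_modification(uni_score, country_score)

# The same raw score gains far more in a low-citation country:
print(blended_score(40, country_score=95))  # about 40.5
print(blended_score(40, country_score=25))  # 60.0
```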

Then we have the problem of those famous kilo-author, mega-cited papers. These are papers with dozens, scores, or hundreds of participating institutions and similar numbers of authors and citations. Until 2015 THE treated every author as though they were the sole author of a paper, including on papers with thousands of authors. Then in 2015 they stopped counting papers with over a thousand authors, and in 2016 they introduced a modified fractional counting of citations for papers with over a thousand authors: citations were distributed proportionally among the authors, with a minimum allotment of five per cent.

There are problems with all of these procedures. Treating every author as the sole author meant that a few places could get massive citation counts from taking part in one or two projects such as the CERN experiments or the Global Burden of Disease Study. On the other hand, excluding mega-papers is not helpful either, since it omits some of the most significant current research.

The simplest solution would be fractional counting all round: just divide the number of citations of each paper by the number of contributors or contributing institutions. This is the default option of the Leiden Ranking and there seems no compelling reason why THE could not do so.
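The difference between the three counting schemes, sketched for a hypothetical 2,000-author paper with 3,000 citations:

```python
def full_count(citations, n_authors):
    # Pre-2015 THE: every contributing institution gets the full count.
    return citations

def modified_fractional(citations, n_authors, floor=0.05):
    # THE's post-2016 scheme for papers with 1,000-plus authors: a
    # proportional share, but never less than 5% of the paper's citations.
    return max(citations / n_authors, citations * floor)

def plain_fractional(citations, n_authors):
    # The Leiden default: simply divide by the number of contributors.
    return citations / n_authors

cites, authors = 3000, 2000
print(full_count(cites, authors))           # 3000
print(modified_fractional(cites, authors))  # 150.0 (the 5% floor binds)
print(plain_fractional(cites, authors))     # 1.5
```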

There are some other issues that should be dealt with. One is the question of self-citation. This is probably not a widespread issue but it has caused problems on a couple of occasions.

Something else that THE might want to think about is the effect of the rise in the number of authors with multiple affiliations. So far only one university has recruited large numbers of adjunct staff whose main function seems to be listing the university as a secondary affiliation at the top of published papers, but there could be more in the future.

Of course, none of this would matter very much if the citations indicator were given a reasonable weighting of, say, five or ten per cent, but at 30 per cent it has more weight than any other indicator -- the next is the research reputation survey with 18 per cent. A single mega-paper, or even a few strategically placed citations in a low-cited field, can have a huge impact on a university's overall score.

There are signs that THE is getting embarrassed by the bizarre effects of this indicator. Last year Phil Baty, THE's ranking editor, spoke about its quirky results.

Recently, Duncan Ross, data director at THE, has written about the possibility of a methodological change. He notes that currently the benchmark world score for each of the 8,000-plus cells is determined by the mean. He speculates about using the median instead. The problem with this is that a majority of papers are never cited, so the median for many of the cells would be zero. So he proposes, based on an analysis of the recent THE Latin American rankings, that the 75th percentile be used.
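The effect of each candidate benchmark on a typically skewed citation distribution (the numbers below are invented, but the shape, with most papers uncited, is realistic):

```python
from statistics import mean, median, quantiles

# A hypothetical cell: most papers uncited, a couple cited very heavily.
citations = [0] * 60 + [1] * 15 + [2] * 10 + [5] * 8 + [20] * 5 + [400] * 2

print(mean(citations))               # 9.75, dragged upwards by two outliers
print(median(citations))             # 0, useless as a benchmark
print(quantiles(citations, n=4)[2])  # 1.75, the 75th percentile
```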

Ross suggests that this would make the THE rankings more stable, especially the Latin American rankings where the threshold number of articles is quite low. 

It would also allow the inclusion of more universities that currently fall below the threshold. This, I suspect, is something that is likely to appeal to the THE management.

It is very good that THE appears willing to think about reforming the citations indicator. But a bit of tweaking will not be enough. 





Sunday, July 22, 2018

The flight from bright: Dartmouth wants nice MBA students

The retreat from intelligence as a qualification for entrance into American universities continues. We have already seen the University of Chicago join the ranks of test-optional colleges and it seems that for many years Harvard has been discriminating against prospective Asian students who supposedly lack the qualities of grit, humour, sensitivity, kindness, courage, and leadership that are necessary to study physics or do research in economics.

There has been a lot of indignation about the implication that Harvard should actually think that Asians were uniquely lacking in humour and grit and so on.

But even if Asians were lacking in these qualities that is surely no reason to deny them admission to elite institutions if they have the ability to perform at the highest intellectual level. Sensitivity, kindness and a sense of humour etc are no doubt desirable but they are highly subjective, culture specific, difficult to operationalise and almost impossible to assess with any degree of validity. They also could have a disparate impact on racial, gender and ethnic groups.

Now Dartmouth College is going down the same path. What do you need to get into the Tuck School of Business?

"True to the school’s long-held reputation for being applicant-friendly and transparent in its admissions process, the new, simplified criteria comprise four attributes reflective of successful Tuck students: smart, nice, accomplished, and aware."

I doubt that Dartmouth will be the only place to admit students because they are nice, or good at pretending to be nice or able to afford niceness trainers. And how will niceness be assessed?

There will be an essay: "Tuck students are nice, and invest generously in one another's success. Share an example of how you have helped someone else succeed. (500 words)."

Referees will be asked: "Tuck students are nice. Please comment on how the candidate interacts with others including when the interaction is difficult or challenging."

Soon no doubt we will hear demands for the niceness of students to be included as an indicator in university rankings. There will be compulsory workshops on how to confront the nastiness within. Studies will show that niceness is an essential attribute for success in research, business, sport, war and journalism, and that it is something in which ciswhitestraightmales, especially those not differently abled, are desperately deficient.

And we are likely to see articles wondering why Asian universities are mysteriously overtaking the West in anything based on cognitive skills. 




How should rankings assess teaching and learning?

My article in University World News can be accessed here. Comments can be made at this blog.

Tuesday, July 17, 2018

Chicago goes test-optional

The University of Chicago has gone test-optional. Prospective students will no longer be required to submit their SAT or ACT scores when applying, although probably most will continue to do so.

Many colleges in the US have done this already and candidates who choose not to submit test scores are admitted on the basis of high school grades, perceived personal attributes, recommendations, essays, extra-curricular activities and/or membership of a valued group. Most of these, but not all, appear to be small liberal arts colleges. Chicago is the first major US research university to do so but is unlikely to be the last.

A common justification is that dropping the test requirement allows universities to recruit students from disadvantaged or underrepresented groups who may not do well on standardised tests. Perhaps it does, but there is also a less altruistic reason. Going test-optional might help Chicago to maintain or even improve its position in the US News rankings while allowing the overall academic ability of its students to slide.

If the students who choose not to submit test scores are those scoring below average, then the reported average scores will rise, which will improve Chicago's standing in the rankings. Apparently it is US News policy to avoid penalising institutions as long as 75% of the incoming class submit their scores. Also, if the university gets more applicants then the admission rate goes down and it appears more selective. All in all, it looks like a win-win situation. But as more students are selected because they can produce a two-minute video, are members of a protected group, or voice support for current orthodoxies, overall academic quality will gradually drift downwards.
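A toy illustration of the selection effect, with invented scores:

```python
from statistics import mean

# A hypothetical admitted class with SAT-style scores.
scores = [1250, 1300, 1350, 1400, 1450, 1500, 1550]

print(mean(scores))  # 1400: the average when everyone must submit

# Under test-optional, suppose the below-average applicants withhold scores.
submitted = [s for s in scores if s >= 1400]
print(mean(submitted))  # 1475: the reported average rises; the class is unchanged
```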


There are signs that higher education in the West is moving away from the objective standardised testing of academic ability. It is likely that those admitted because they are likeable or passionate, show leadership qualities or bring new perspectives to the classroom will find the cold realities of advanced physics or philosophy frustrating and will demand that standards be adjusted to accommodate them.

Meanwhile the long slow convergence of America and China will continue. China is now level with the US for many measures of research output and parity in quality will  probably come soon. If American schools abandon the rigorous selection of students, teachers and researchers they are likely to fall behind.


Wednesday, July 11, 2018

Should Malaysian universities celebrate rising in the QS Rankings?



My article in the Kuala Lumpur New Straits Times can be accessed here. You can post comments at this blog.


Saturday, July 07, 2018

The THE European Teaching Rankings

On July 11th Times Higher Education (THE) will publish their new European university rankings. These are supposed to be about teaching and seem to give priority to students as consumers of higher education.

They are similar to THE's Japanese and US rankings with four "pillars": Engagement (five indicators derived from the European Student Survey), Resources (three indicators), Outcomes (three indicators) and Environment, which consists entirely of the gender ratio of faculty and students.

THE are presenting these rankings as an innovative pilot project, so they might contain interesting insights lacking in other international rankings. But it looks like THE will follow previous practice and only give scores for the four pillars, not for the component indicators. This would drastically reduce their value for students and other stakeholders, since it would be difficult or impossible to figure out exactly what has contributed to a high or a low score for any of the pillars.

Although the rankings claim to assess teaching, there is still a substantial research component here. Papers to staff ratio gets a weighting of 7.5%, and THE's survey of postgraduate teaching, which correlates very closely with the research survey, gets 10%.

What is missing here is any serious measure of the quality of students or graduates. This is the great omission of the current global ranking scene. QS has a survey of employers and CWUR counts the prizes won by university alumni, but neither of these is relevant for the great majority of institutions around the world.

The most valuable metrics in the US News national rankings are the test scores and high school standing of admitted students. The blunt reality is that employers and graduate and professional schools are interested in the cognitive skills, subject knowledge, conscientiousness and, sadly and increasingly, the willingness to conform of graduates, and the ability of universities to nurture these is closely related to students' performance on standardised tests and national exams. It is disappointing that THE have been unable to find a way of capturing the quality of students and graduates.

It is also odd that THE are able to supply data on only one aspect of institutional environment, that is gender ratio.

U-Multirank already covers some of the indicators included in the new rankings and has a reasonable coverage of European universities. Whether THE can do better will be seen on the eleventh.



Wednesday, June 27, 2018

USA has the world's most powerful computer but China still holds the lead for supercomputing

TOP500 has been compiling lists of the world's most powerful supercomputers for the last quarter of a century. Back in June 1993 the world's most powerful computer was the Numerical Wind Tunnel, built by Fujitsu for the National Aerospace Laboratory of Japan, followed by Thinking Machines Corporation's CM-5/1024 at Los Alamos National Laboratory in the USA.

In that year the top 500 included 232 machines in the US, 115 in Japan, 56 in Germany, 25 in France and 24 in the UK. There were three in Taiwan but none in mainland China.

The first Chinese supercomputer did not appear in the top 500 until November 1999.

Fast forward to June 2018. There are now 206 supercomputers in China, up from 202 last November; 124 in the US, down from 143; 36 in Japan, up from 35; 22 in the UK, up from 15; 21 in Germany, unchanged; and 18 in France, also unchanged.

So China's displacement of the US continues but there is one ray of hope. The world's most powerful supercomputer is in the US for the first time in five and a half years: Summit, at the Department of Energy's  Oak Ridge National Laboratory. But that seems small compensation for the growing gap between China and the US.

The latest ranking also shows that large areas of the world are computer deserts. In the whole of Africa there is only one supercomputer, in South Africa. There are only four in the Islamic world, all in Saudi Arabia, three of them at Aramco. There is one in Latin America, in Brazil.


Tuesday, June 19, 2018

Are the US and the UK really making a comeback?

The latest QS World University Rankings and the THE World Reputation Rankings have just been published. The latter will feed into the forthcoming world rankings, where the two reputation indicators, research and postgraduate teaching, will account for 33 per cent of the total weighting.

The THE reputation rankings include only 100 universities. QS is now ranking close to 1,000 universities and provides scores for 500 of them including academic reputation and employer reputation.

The publication of these rankings has led to claims that British and American universities are performing well again after  a period of stress and difficulty. In recent years we have heard a great deal about the rise of Asia and the decline of the West. Now it seems that THE and QS are telling us that things are beginning to change.

The rise of Asia has perhaps been overblown, but if Asia is narrowly defined as Northeast Asia and Greater China then there is definitely something going on. Take a look at the record of Zhejiang University in the Leiden Ranking publications indicator. In 2006-09 Harvard produced a total of 27,422 papers and Zhejiang University 11,173. In the period 2013-16 the numbers were 33,045 for Harvard and 20,876 for Zhejiang. In seven years Zhejiang has gone from 41% of Harvard's output to 63%. It is not impossible that Zhejiang will reach parity within two decades.
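A back-of-the-envelope extrapolation from those two data points, assuming, and it is a strong assumption, that both universities' recent growth rates simply continue:

```python
import math

# Leiden publication counts quoted above.
harvard = {"2006-09": 27422, "2013-16": 33045}
zhejiang = {"2006-09": 11173, "2013-16": 20876}

years = 7  # gap between the midpoints of the two publication windows
h_growth = (harvard["2013-16"] / harvard["2006-09"]) ** (1 / years)    # ~2.7%/yr
z_growth = (zhejiang["2013-16"] / zhejiang["2006-09"]) ** (1 / years)  # ~9.3%/yr

# Years until parity if both growth rates continued unchanged.
gap = harvard["2013-16"] / zhejiang["2013-16"]
to_parity = math.log(gap) / math.log(z_growth / h_growth)
print(round(to_parity, 1))  # about 7 years on this naive projection
```

On this naive projection parity would arrive well within a decade; growth rates rarely stay constant, which is why the estimate in the text is deliberately cautious.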

We are talking about quantity here. Reaching parity for research of the highest quality and the greatest impact will take longer but here too it seems likely that within a generation universities like Peking, Zhejiang, Fudan, KAIST and the National University of Singapore will catch up with and perhaps surpass the Ivy League, the Russell Group and the Group of Eight.

The scientific advance of China and its neighbours is confirmed by data from a variety of sources, including the deployment of supercomputers, the use of robots and, just recently, the Chinese Academy of Sciences holding its place at the top of the Nature Index.

There are caveats. Plagiarism is a serious problem and the efficiency of Chinese research culture is undermined by  cronyism and political conformity. But these are problems that are endemic, and perhaps worse, in Western universities.

So it might seem surprising that the two recent world rankings show that American and British universities are rising again. 

But perhaps it should not be too surprising. QS and THE emphasise reputation surveys, which have a weighting of 50% in the QS world rankings and 33% in THE's. There are signs that British and American universities and others in the Anglosphere are learning the reputation management game while universities in Asia are not so interested.

Take a look at the top fifty universities in the QS academic reputation indicator, which is supposed to be about the best universities for research. The countries represented are:
US 20
UK 7
Australia 5
Canada 3
Japan 2
Singapore 2
China 2
Germany 2.

There is one each for Switzerland, Hong Kong, South Korea, Mexico, Taiwan, France and Brazil.

The top fifty universities in the QS citations per faculty indicator, a measure of research excellence, are located in:
USA 20
China 4
Switzerland 4
Netherlands 3
India  2
Korea 2
Israel 2
Hong Kong 2
Australia 2.

There is one each from Saudi Arabia, Italy, Germany, UK, Sweden, Taiwan, Singapore and Belgium.

Measuring citations is a notoriously tricky business and probably some of the high flyers in the reputation charts are genuine local heroes little known to the rest of the world. There is also now a lot of professional advice available about reputation management for those with cash to spare. Even so it is striking that British, Australian, and Canadian universities do relatively well on reputation in the QS rankings while China, Switzerland, the Netherlands, India and Israel do relatively well for citations.

For leading British universities the mismatch is very substantial. According to the 2018-19 QS world rankings, Cambridge is 2nd for academic reputation but 71st for citations; Manchester is 33rd and 221st; King's College London 47th and 159th; Edinburgh 24th and 181st. It is not surprising that British universities should perform well in rankings with a 40% weighting for academic reputation.

The THE reputation rankings have produced some good results for several US universities:

UCLA has risen from 13th to 9th
Cornell from 23rd to 18th
University of Washington from 34th to 28th
University of Illinois Urbana-Champaign from 36th to 32nd
Carnegie Mellon from 37th to 30th
Georgia Institute of Technology from 48th to 44th.
Some of this is probably the result of a change in the distribution of survey responses. I have already pointed out that the fate of Oxford in the THE reputation rankings is tied to the percentage of responses from the arts and humanities. THE have reported that their survey this year had an increased number of responses from computer science and engineering and a reduced number from the social sciences and the humanities. Sure enough, Oxford has slipped slightly while LSE has fallen five places.

The shift towards computer science and engineering in the THE survey might explain the improved reputations of Georgia Tech and Carnegie Mellon. But there is, I suspect, something else going on: the growing obsession of some American universities with reputation management, public relations and rankings, including the hiring of professional consultants.

In contrast, Asian universities have not done so well in the THE reputation rankings:

University of Tokyo has fallen from 11th to 13th place
University of Kyoto from 25th to 27th
Osaka University from 51st to 81st
Tsinghua University is unchanged in 14th
Peking University unchanged in 17th
Zhejiang University has fallen from the 51-60 band to 71-80
University of Hong Kong has fallen from 39th to 40th.

All but one of these US universities have fallen in the latest Nature Index: UCLA by 3.1%, University of Washington 1.7%, University of Illinois Urbana-Champaign 12%, Carnegie Mellon 4.8%, Georgia Tech 0.9%.

All but one of the Asian universities have risen in the Nature Index: Tokyo by 9.2%, Kyoto 15.1%, Tsinghua 9.5%, Peking 0.9%, Zhejiang 9.8%, Hong Kong 25.3%.

It looks as though Western and Asian universities are diverging. The former are focused on branding, reputation, relaxing admission criteria and the search for diversity. They are increasingly engaged with, or even obsessed by, the rankings.

Asian universities, especially in Greater China and Korea, are less concerned with rankings and public relations and more with academic excellence and research output and impact. 

As the university systems diverge it seems that two different sets of rankings are emerging to cater for the academic aspirations of different countries.

Nature Index: Is This the Future of Science?


The Nature Index ranks countries and institutions according to their publications in the most highly reputed scientific journals. It is a reliable guide to performance at the highest levels of research. Here are the academic institutions in the current top 100 that have risen or fallen by ten per cent or more in the latest edition. The indicator is adjusted fractional count, 2016-2017.

The 2018 world rank is on the left; the percentage increase or decrease is on the right.

I think I see a few patterns here.

Rising Institutions

14.  National Institutes of Health, USA  10.0%
20.  Kyoto University  15.1%
31.  University of Chinese Academy of Sciences  64.8%
37.  National University of Singapore  10.5%
41.  Indian Institutes of Technology (all of them)  28%
44.  Fudan University  11.1%
61.  Texas A&M University  23.7%
62.  Shanghai Jiao Tong University  30.4%
68.  Wuhan University  31.3%
69.  University of Edinburgh  11.5%
72.  University of Bristol  25.3%
74.  University of Texas Southwestern Medical Center  20.5%
76.  Sun Yat-sen University, China  26.6%
81.  Xiamen University  18.7%
86.  University of Utah  22.2%
95.  Sichuan University  24%
98.  University of Würzburg  19.7%

Falling Institutions

11.  University of Oxford  -15.2%
24.  Yale University  -13.6%
38.  University of Illinois Urbana-Champaign  -12%
43.  EPF Lausanne  -11.2%
47.  University of Minnesota  -15.5%
55.  Leibniz Association, Germany  -10%
57.  Duke University  -15.3%
82.  McGill University  -15.9%
84.  Tohoku University  -18.3%
88.  Rutgers University  -17.3%
89.  Technical University of Munich  -11.6%
91.  University of Zurich  -11.8%
93.  NASA, USA  -16.5%

Monday, June 18, 2018

Responses to the QS Academic Survey


QS has published the percentage of responses to this year's academic survey (which is about the best universities for research) from different countries. 

The table below combines the percentages in the surveys of 2007, 2013 and 2018 (used for the "2019" rankings), ranked by the percentages for 2013. The data for 2007 and 2013 are from a previous post.

There are some interesting things here. The UK has more responses than France and Germany combined. 

There are more responses from Malaysia than from China.

There are more responses from Kazakhstan than from India.

There are almost as many responses from Australia, Canada and New Zealand as there are from the USA.

The number of responses from these countries has risen by a percentage point or more since 2013: Russia, Malaysia, Iraq, Kazakhstan.

The number of responses from these countries has fallen by a percentage point or more since 2013: USA, Brazil, Italy, Germany, Hungary. 


Table: Percentage of responses to QS academic survey.


Country         2007   2013   2018
USA             10.0   17.4    8.5
UK               5.6    6.5    7.0
Brazil           1.1    6.3    3.3
Italy            3.3    4.7    3.5
Germany          3.0    3.8    2.5
Canada           4.0    3.4    3.3
Australia        3.5    3.2    4.0
France           2.9    2.4    2.0
Japan            1.9    2.9    3.5
Spain            2.3    2.7    3.1
Mexico           0.8    2.6    2.3
Hungary           --    2.0    0.9
Russia           0.7    1.7    4.0
India            3.5    1.7    2.6
Chile             --    1.7    2.0
Ireland          1.5    1.6    0.9
Malaysia         3.2    1.5    4.6
Belgium          2.6    1.4    0.7
Hong Kong        1.9    1.4    1.5
Taiwan           0.7    1.3    2.0
Netherlands      0.6    1.2    0.9
New Zealand      4.1    1.2    1.0
Singapore        2.5    1.2    0.8
China            1.6    1.1    1.8
Portugal         0.9    1.1    1.0
Colombia          --    1.1    1.6
Argentina        0.7    1.0    0.8
South Africa     0.7    1.0    0.8
Denmark          1.2    0.9    0.5
Sweden           1.7    0.9    0.8
Switzerland      1.5    0.8    0.7
Austria          1.3    0.8    0.5
Turkey           1.1    0.7    0.8
Indonesia        1.2    0.5    0.9
Philippines      1.8    0.3    0.5
Iraq              --    0.2    1.4
Kazakhstan        --    0.9    3.0
South Korea        ?      ?    4.0