John Field, an expert on lifelong learning, comments on the growing Brexit hysteria blowing through academia.
Professor Field quotes the Vice Chancellor of the University of York:
"York, along with many other British universities, appears to have fallen in the QS league table because of concerns about the impact of Brexit; specifically, this has been attributed to worries about future access to research funding and whether we will be able to recruit excellent academic staff and students from all over the world."
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Wednesday, September 07, 2016
The shadow of Brexit falls across the land
The western chattering and scribbling classes sometimes like to reflect on their superiority to the pre-scientific attitudes of the local peasantry: astrology, nationalism, religion and the like. But it seems that the credentialled elite of Britain are now in the grip of a great fear of an all-pervading spirit called Brexit, whose malign power is unlimited in time and space.
Thus the Independent tells us that university rankings (QS in this case) show that "post Brexit uncertainty and long-term funding issues" have hit UK higher education.
The Guardian implies that Brexit has something to do with the decline of British universities in the rankings without actually saying so.
"British universities have taken a tumble in the latest international rankings, as concern persists about the potential impact of Brexit on the country’s higher education sector. "
Many British universities have fallen in the QS rankings this year but the idea that Brexit has anything to do with it is nonsense. The Brexit vote was on June 23rd, well after QS's deadlines for submitting respondents for the reputation surveys and updating institutional data. The citations indicator refers to the period 2011-2015.
The belief that rankings reveal the dire effects of funding cuts and immigration restrictions is somewhat more plausible but fundamentally untenable.
Certainly, British universities have taken some blows in the QS rankings this year. Of the 18 universities in the top 100 in 2015 two are in the same place this year, two have risen and 14 have fallen. This is associated with a general decline in performance in the academic reputation indicator which accounts for 40% of the overall score.
Of those 18 universities three, Oxford, Cambridge and Edinburgh, hold the same rank in the academic reputation indicator, one, King's College London, has risen and fourteen are down.
The idea that the reputation of British universities is suffering because survey respondents have heard that the UK government is cutting spending or tightening up on visa regulations is based on some unlikely assumptions about how researchers go about completing reputation surveys.
Do researchers really base their assessment of research quality on media headlines, often inaccurate and alarmist? Or do they make an honest assessment of performance over the last few years or even decades? Or do they vote according to their self interest, nominating their almae matres or former employers?
I suspect that the decline of British universities in the QS reputation indicator has little to do with perceptions about British universities and a lot more to do with growing sophistication about and interest in rankings in the rest of the world, particularly in East Asia and maybe parts of continental Europe.
What was that about the origins of science in seventeenth century England?
Trigger warning
If you're triggered by just about anything, don't read this.
Those who dislike inherited privilege will be entertained by this account of the last days of Charles II. It is from a post by Gregory Cochran at the blog West Hunter.
It seems that there has been a little bit of progress over the centuries. The future Charles III has a thing about homeopathy, expensive pseudoscientific rubbish but at least it's harmless.
I can't help wondering whether the malign spirit of pseudoscience has now taken refuge in university faculties of social science with their endless crises of irreproducible research.
"Back in the good old days, Charles II, age 53, had a fit one Sunday evening, while fondling two of his mistresses.
Monday they bled him (cupping and scarifying) of eight ounces of blood. Followed by an antimony emetic, vitriol in peony water, purgative pills, and a clyster. Followed by another clyster after two hours. Then syrup of blackthorn, more antimony, and rock salt. Next, more laxatives, white hellebore root up the nostrils. Powdered cowslip flowers. More purgatives. Then Spanish Fly. They shaved his head and stuck blistering plasters all over it, plastered the soles of his feet with tar and pigeon-dung, then said good-night.
Tuesday. Ten more ounces of blood, a gargle of elm in syrup of mallow, and a julep of black cherry, peony, crushed pearls, and white sugar candy.
Wednesday. Things looked good: only senna pods infused in spring water, along with white wine and nutmeg.
Thursday. More fits. They gave him a spirituous draft made from the skull of a man who had died a violent death. Peruvian bark, repeatedly, interspersed with more human skull. Didn’t work.
Friday. The king was worse. He tells them not to let poor Nelly starve. They try the Oriental Bezoar Stone, and more bleeding. Dies at noon."
Saturday, September 03, 2016
Another Important Ranking
Ranking fans have a busy week ahead of them. On Tuesday the QS world rankings will be announced and results will probably start leaking on Sunday or Monday. Then there will be the Shanghai broad subject rankings.
Times Higher Education have promised a major revelation on Monday. I suspect that this might just be the top ten or twenty of the world rankings or a preview of their new US college rankings.
But this ranking might be more important. HackerRank, "a platform that ranks engineers based on their coding skills and helps companies discover talent faster", has just published a ranking of countries according to the speed and accuracy with which developers solve a variety of coding challenges.
China is first and Russia second.
The USA is 28th and the UK 29th. Eastern Europe and East Asia generally perform well.
For once, there is some fairly good news for Africa and the Muslim world: Turkey is 30th, Egypt 42nd, Bangladesh 44th and Nigeria 48th.
The top ten are
1. China
2. Russia
3. Poland
4. Switzerland
5. Hungary
6. Japan
7. Taiwan
8. France
9. Czech Republic
10. Italy
Tuesday, August 30, 2016
The Pursuit of Excellence
If you are wondering how they did it, see the story in the Hindustan Times.
"The Institute of Excellence and Higher Education (IEHE) in Bhopal improved its teacher and student ratio from 1:47 to 1:24 a day before the National Assessment and Accreditation Council (NAAC) team was scheduled to visit to retain the institute’s Grade ‘A’.
A three-member NAAC team, led by former vice chancellor SK Singh, will reach on Monday and inspect the institute in 24 sessions.
IEHE, which was facing hardships due to shortage of teachers, appointed 54 guest faculties in a week. The strength of teachers increased from 58 to 112."
There is nothing very unusual about this sort of thing. There have been, for example, suspicions about some British universities offering "relatively short-term contracts" that expire just after the Research Excellence Framework (REF) assessment is completed.
Tuesday, August 23, 2016
The Naming of Universities
You can tell a few things about universities from their names. If, for example, a university has a compass direction in its name then it is usually not ranked very highly: University of East London, Southern Illinois University. Those named after long-dead people -- Yale, Duke -- are often, though not always (Bishop Grosseteste), very prestigious.
It might be an idea to have a ranking of institutions with the most interesting or strangest names. After all nearly everything else about higher education is ranked somewhere or other. California University of Pennsylvania should be near the top. And of course there is Hamburger University and Butler University. Or, from a few years ago, The Universal Institute of Food Management and Hotel Administration I was Called by the Almighty and I Said Yes My Lord, which was actually a restaurant somewhere along the road from Maiduguri to Kano in Nigeria.
Another high flier might be the Lovely Professional University, a "semi-residential university college in North India", which is ranked 4326th in the world and 213th in India by Webometrics. I doubt it will get many votes in the QS or THE academic surveys unless it changes its name, which I suspect might be a literal translation from Hindi or Sanskrit.
Sunday, August 21, 2016
Worth Watching
Video
Salvatore Babones, Gaming the Rankings Game: University Rankings and the Future of the University
Thursday, August 18, 2016
Shanghai Rankings 2016 Update
An interesting tweet from qwertzuiop reports that the average change in rank position in this year's Shanghai rankings was 32 compared to 11.7 between 2014 and 2015. Changes in methodology, even simple ones, can lead to a lot of churning.
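For illustration only, here is a minimal sketch of how an "average change in rank position" figure like the one in the tweet can be computed. The rankings below are invented toy data, not actual Shanghai results, and this is not necessarily the calculation qwertzuiop used.

```python
# Hedged sketch: mean absolute change in rank between two ranking years,
# computed over the universities that appear in both years.
# All names and ranks here are invented examples.
def mean_rank_change(ranks_old, ranks_new):
    """Mean absolute change in rank for universities ranked in both years."""
    common = ranks_old.keys() & ranks_new.keys()
    return sum(abs(ranks_old[u] - ranks_new[u]) for u in common) / len(common)

ranks_2015 = {"Alpha U": 1, "Beta U": 2, "Gamma U": 3, "Delta U": 4}
ranks_2016 = {"Alpha U": 1, "Beta U": 5, "Gamma U": 2, "Delta U": 10}
print(mean_rank_change(ranks_2015, ranks_2016))  # (0 + 3 + 1 + 6) / 4 = 2.5
```

A large jump in this number between consecutive editions, as reported here, is the signature of a methodology change rather than of real movement among universities.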
Meanwhile, here are the correlations between the various indicators in the ranking. In general, it seems that the indicators are not measuring exactly the same thing and they do not raise red flags by showing a low or zero association with each other.
The lowest correlations are between publications and alumni and award (alumni and faculty winning Nobel and Fields awards). Publications are papers in the Science Citation Index and the Social Science Citation Index in 2015 while the alumni and award indicators go back several decades. Time makes a difference and as a measure of contemporary research excellence Nobel and Fields awards may be losing their relevance.
[Correlation table not reproduced: the matrix of pairwise correlations between the ARWU indicators, ending with PCP, was lost in formatting.]
All correlations are significant at the 0.01 level (2 tailed).
N is 500 in all cases except for Nature and Science where it is 497
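The kind of calculation behind such a table can be sketched as follows. This is a plain Pearson correlation between two indicator score vectors; the numbers below are invented toy data, not the actual ARWU indicator scores (where N is 500, or 497 for Nature and Science).

```python
# Hedged sketch: Pearson correlation between two indicator score vectors.
# The alumni and pub vectors are invented examples, not real ARWU scores.
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

alumni = [10, 20, 30, 40, 50]   # toy "alumni" indicator scores
pub    = [12, 18, 33, 39, 52]   # toy "publications" indicator scores
print(round(pearson(alumni, pub), 3))
```

Repeating this for every pair of indicators, and testing each coefficient for significance, would reproduce a matrix of the sort described above.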
Wednesday, August 17, 2016
The Shanghai Rankings: More Interesting This Year
The Shanghai rankings are usually the most stable and therefore the least interesting (for journalists, politicians and bureaucrats) of the current array.
This year, however, they are quite volatile. The reason is that the Shanghai Ranking Consultancy has completed the transition from the old to the new list of highly cited researchers supplied by Thomson Reuters. In 2014 and 2015 it used both lists with equal weighting, which softened the abruptness of the transition. In addition, the rankings now count only primary affiliations. As a result there have been enough ascents and descents to gladden the hearts of higher education journalists and experts.
It should be noted that the effect of this is largely to accelerate trends that were in progress anyway. The old list was clearly out of date and it was time for a new one.
First, to my predictions. Harvard is still number one. Wisconsin at Madison, Rutgers and Virginia Polytechnic Institute have all fallen. Aalborg, Nanyang Technological University, Peking, Chiba and Tsinghua have risen. Peking and Tsinghua are now in the top 100 and heading for the top 50.
But the University of Tehran has not risen. It has fallen by 84 places, presumably because it lost a highly cited researcher during the second half of 2015.
Overall, the rankings provide more evidence for the rise of China with two universities in the top 100 and 54 in the top 500 compared with none and 44 last year, but not the rest of Asia. South Korea has gone from 12 in the top 500 to 11, Japan from 18 to 16 and Israel 6 to 5. India still has only one representative in the top 500 and Malaysia two.
Meanwhile the USA now has 137 universities in the top 500 compared with 146 last year.
Rapidly rising institutions include Toulouse School of Economics, from 375th to 265th, largely because of being given a free pass this year for papers in Nature and Science, University of the Witwatersrand from 244th to 204th, University of Queensland from 77th to 55th, and King Abdullah University of Science and Technology from 352nd to 254th.
Kwazulu-Natal has fallen from 413th to 494th, Dartmouth College from 215th to 271st and Universiti Malaya from 353rd to 413th.
Saturday, August 13, 2016
My Predictions for the Shanghai Rankings
Watch this, Nate Silver.
In the latest edition of the Shanghai Academic Ranking of World Universities (ARWU) to be announced on Monday, the first place will go to Harvard.
The methodology for this prediction is based on an extremely complex and sophisticated algorithm that incorporates a large number of variables and will remain a secret for the moment.
Now for some easier predictions.
The Shanghai rankings are generally famous for their stability and consistency which makes them rather boring for journalists and naive administrators. No shocking headlines about catastrophic plunges in the rankings after the latest vandalism by government Scrooges.
But the Shanghai rankers have had problems with their highly cited researchers indicator. Thomson Reuters have stopped adding to their old list of highly cited researchers and have published a new one. The Shanghai Ranking Consultancy combined the two lists in 2014 and 2015 and have said that this year only the new list will be used in calculating the overall score.
Last year Shanghai published a list of scores for the highly cited indicator in 2013 (the old list), combined scores in 2015 and scores if the new list alone had been used.
Here are the universities predicted to rise or fall by ten points (two points in the weighted overall rankings) as the new list replaces the combined lists. This assumes that universities have not recruited or lost highly cited researchers during 2015; if they have, the predictions will be incorrect. Also, changes in the highly cited indicator may be balanced by changes in other indicators.
Predicted to Fall
University of Wisconsin, Madison
Rutgers: State University of New Jersey
Virginia Polytechnic Institute
Predicted to Rise
Aalborg University
Nanyang Technological University
Peking University
Chiba University
Tsinghua University
University of Tehran
Thursday, August 11, 2016
Value Added Ranking
There has been a lot of talk about ranking universities by factors other than the usual mix of contributions to research and innovation, reputation surveys and inputs such as spending, teaching resources or student quality.
The emerging idea is that universities should be assessed according to their ability to teach students or to inculcate desirable skills or attributes.
Much of this is powered by the growing awareness that American and European secondary schools are failing to produce sufficient numbers of students with the ability to undertake and complete anything that could realistically be called a university education. It is unlikely that this is the fault of the schools. The unavoidable verdict of recent research is that the problem with schools has very little to do with institutional racism, a lack of grit, resilience or the current X factor or the failure to adopt Finnish, Chinese or Singaporean teaching methods. It is simply that students entering the school system are on average less intelligent than they were and those leaving are consequently also less intelligent.
There is now a market for rankings that will measure the quality of universities not by their resources, wealth or research output but by their ability to add value to students and to prepare them for employment or to enable them to complete their courses.
This could, however, lead to massively perverse consequences. If universities are assessed according to the percentage of entrants who graduate within a certain period, or according to their employability, then there could be a temptation to dilute graduation requirements.
Nevertheless, the idea of adding value is one that is clearly becoming more popular. It can be seen in the attempt to introduce a national rating system in the US and in the UK to use the proposed Teaching Excellence Framework (TEF) to rank universities.
One UK ranking that includes a value added measure is the Guardian University Guide. This includes eight indicators, three of which measure student satisfaction. Other indicators are staff student ratio and spending per student. There is also a measure of student outcomes, that is graduate level employment or entry into a postgraduate course after six months; one of the quality of students, measured by A level qualifications; and a measure of value added, that is the difference between students' entry-level exam results and their eventual degree results.
It is therefore possible to get a rough idea of what factors might actually produce positive student outcomes.
The overall ranking for 2015-16 starts by being quite conventional with the top three places going to Cambridge, Oxford and St Andrews. Some might be surprised by Exeter in 9th place and Loughborough in 11th, ahead of LSE and UCL.
Measuring student quality by exam scores produces unsurprising results at the top. Cambridge is first followed by Oxford and Imperial. For staff student ratio the top three are UCL, Oxford and SOAS and for spending per student Oxford, Cambridge and the University of the Arts London.
For student satisfaction with courses, Bath, Keele and UEA are in the lead while Oxford is 5th and Cambridge 12th. It's when we look at the Value Added that we find some really unusual results. The top three are Gloucester, Edinburgh and Abertay.
After plugging the indicator scores into an SPSS file we can calculate the correlations between the desired outcome, that is graduate level employment or postgraduate study and a variety of possible associated factors.
Here in descending order are the correlations with career prospects:
average entry tariff .820
student staff ratio .647
spending per student .569
satisfaction with course .559
satisfaction with teaching .531
value added .335
satisfaction with feedback -.171.
It would seem that if you want to know which university is best for career prospects then the most important piece of data is the average academic ability of the students. The student staff ratio and money spent are also significant as is satisfaction with courses and teaching.
The correlation between value added and career prospects is much less and rather modest.
The universities were divided into thirds according to average entry tariff. In the top third of universities there was a strong correlation between career prospects and average entry level tariff, .628, and a modest one with spending, .355. Nothing else was associated with career success.
In the middle third the factor most associated with career prospects was course satisfaction, .498, followed by average entry tariff, .449, staff student ratio, .436, and satisfaction with teaching, .362. Satisfaction with feedback and value added were insignificant.
However, for the least selective third of universities, the picture was rather different. The factor most strongly associated with career success was satisfaction with feedback, .493, followed by value added, .479, course satisfaction, .470, satisfaction with teaching, .439, and average entry tariff, .401. The relationship with spending and staff student ratio was insignificant.
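The tercile comparison described above can be sketched as follows. Everything here is invented toy data rather than the actual Guardian indicator scores, and the split into thirds is done in the simplest possible way:

```python
# Hedged sketch: split universities into thirds by average entry tariff,
# then correlate career prospects with an indicator within each third.
# All rows below are invented examples, not real Guardian data.
import math
from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# rows: (entry_tariff, value_added, career_prospects) -- toy figures
rows = [(550, 4.1, 88), (520, 3.9, 84), (480, 5.0, 80),
        (430, 6.2, 75), (410, 5.8, 72), (390, 6.5, 70),
        (350, 7.9, 68), (320, 7.1, 62), (300, 8.4, 60)]

rows.sort(key=lambda r: r[0], reverse=True)   # most selective first
third = len(rows) // 3
terciles = [rows[:third], rows[third:2 * third], rows[2 * third:]]

for label, group in zip(("top", "middle", "bottom"), terciles):
    tariff = [r[0] for r in group]
    career = [r[2] for r in group]
    print(label, round(pearson(tariff, career), 3))
```

Running the same within-group correlations for each of the Guardian indicators is what produces the pattern reported above, where value added only matters at the least selective institutions.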
The evidence of the Guardian rankings is that value added would only be of interest to students at or applying to the least selective third of UK universities. For the rest it is of no importance. It is debatable whether it is worth making it the centre of a new set of rankings.
Worth Reading 7
Richard Holmes
ABSTRACT
This paper analyses the global university rankings introduced by Times Higher Education (THE) in partnership with Thomson Reuters in 2010 after the magazine ended its association with its former data provider Quacquarelli Symonds. The distinctive features of the new rankings included a new procedure for determining the choice and weighting of the various indicators, new criteria for inclusion in and exclusion from the rankings, a revised academic reputation survey, the introduction of an indicator that attempted to measure innovation, the addition of a third measure of internationalization, the use of several indicators related to teaching, the bundling of indicators into groups, and, most significantly, the employment of a very distinctive measure of research impact with an unprecedentedly large weighting. The rankings met with little enthusiasm in 2010 but by 2014 were regarded with some favour by administrators and policy makers despite the reservations and criticism of informed observers and the unusual scores produced by the citations indicator. In 2014, THE announced that the partnership would come to an end and that the magazine would collect its own data. There were some changes in 2015 but the basic structure established in 2010 and 2011 remained intact.
Forthcoming in Asian Journal of University Education, December 2015. Prepublication copy can be accessed here.
Wednesday, August 03, 2016
Ghost Writers
An article by Chris Havergal in Times Higher Education reports on research by Lisa Lines, a lecturer at the University of New South Wales, published in Teaching in Higher Education (behind a paywall), which suggests that the output of ghost-written student essays is probably greater than expected.
The researcher ordered undergraduate and master's essays in history and then had them marked by "leading academics." Of the 13 undergraduate essays only two received a failing grade, while six of the 13 master's essays failed and seven passed.
Lines says that the quality of the purchased essays was surprisingly high.
Possibly. Or you could say that the standard of marking was surprisingly low. Note that this was at a university that is in the top 150 in the world according to ARWU.
Havergal quotes Lines as saying:
“It is clear that this type of cheating is virtually undetectable by academics when students take precautions against being caught,” she concludes.
“This fact, coupled with the study’s findings that the quality of essays available for purchase is sufficient to receive a passing grade or better, reveals a very troubling situation for universities and poses a real threat to academic integrity and standards, and public perceptions of these.”
The problem lies not with dishonest students or crooked essay writers but with corrupt selection practices that admit academically incompetent students, and with a dysfunctional employment system. If you have students who cannot write and intelligent graduates who cannot find work, then ghost writing is inevitable.
Monday, July 25, 2016
Looks Like THE isn't Big in Japan Anymore
It has taken a long time but it seems that Japanese universities are getting a little irritated about the Times Higher Education (THE) World and Asian University Rankings.
I have commented on the THE Asian rankings here, here and here.
According to the Nikkei Asian Review,
Research University 11, a consortium of Japan's top 11 universities, issued a statement earlier this month that the Times Higher Education ranking should not be used to determine national policy or as an achievement indicator.
Another umbrella group, the Research University Network of Japan, which includes universities and research institutions, has opposed the ranking every year Japanese universities have taken big tumbles.
and
So achieving a higher ranking does not necessarily correlate with providing better educations and research opportunities.
For some universities, there is another worry -- politics. The Japanese government in 2013 said it would aim to ensure that Japanese universities rank among the world's top 100 over the following decade. Now, Japanese universities are required to develop specific strategies to help the government reach this "revitalization" goal.
Saturday, July 23, 2016
Another Important Ranking
Another ranking that should be looked at very carefully is the International Mathematical Olympiad, designed for pre-university students, the first of which was held in Romania in 1959. The competition includes problems in algebra, pre-calculus, complex geometry and functional equations.
Twenty years ago the Olympiad was dominated by ex-communist Eastern Europe. In 1996, first place was taken by Romania while Hungary was third and Russia fourth. Now, East Asia and the Chinese diaspora are dominant: South Korea second, China third, Singapore fourth, Taiwan fifth, North Korea sixth, Hong Kong ninth, Japan tenth.
The USA is first this year, as it was in 2015, with an all-male team whose members have three South Asian and three Chinese surnames.
The rankings look pretty much like the PISA and TIMSS test scores. Combined with the recent coding competition and the Top500 supercomputing ranking, they suggest the intellectual and economic leaders of this century will be in East Asia and Eastern Europe including Russia.
The USA and the UK might do fairly well if they can introduce and maintain sensible immigration and educational selection policies.
The American success, unfortunately, is not good enough for the conventional education media. The team is not diverse enough: no women, no historically underrepresented minorities. So far nobody has protested about the absence of transgender or openly gay students but perhaps their time will eventually come.
Education Week reports that:
"According to Mark Saul, the director of competitions for the Mathematical Association of America, not a single African-American or Hispanic student—and only a handful of girls—has ever made it to the Math Olympiad team in its 50 years of existence."
To overcome this problem, the organisers of events leading up to the Olympiad have added competitions that test creativity and collaboration and are judged subjectively.
"In the past few years, MathCounts added two new middle school programs to try to diversify its participant pool—National Math Club and the Math Video Challenge.
"Schools or teachers who sign up for the National Math Club receive a kit full of activities and resources, but there's no special teacher training and no competition attached.
The Math Video Challenge is a competition, but a collaborative one. Teams of four students make a video illustrating a math problem and its real-world application.
After the high-pressure Countdown round at this year's national MathCounts competition, in which the top 12 students went head to head solving complex problems in rapid fire, the finalists for the Math Video Challenge took the stage to show their videos. The demographics of that group looked quite different from those in the competition round—of the 16 video finalists, 13 were girls and eight were African-American students. The video challenge does not put individual students on the hot seat—so it's less intimidating by design. It also adds the element of artistic creativity to attract a new pool of students who may not see themselves as "math people."
An 8th grade team from the Ron Clark Academy, an independent middle school in Atlanta that serves low-income students, was among the finalists. The students illustrated a complicated multistep problem entirely through rap. None had ever been involved in a math competition before."
In other words, the competitions will be less and less about mathematics and more and more about making rap videos and the like. No doubt Russia, China and Korea will be flocking to the US to see how it's done. Much the same thing has been happening with national competitive debating.
Here are this year's results and those for 2015 and 1996.
| Rank 2016 | Team | Rank 2015 | Rank 1996 |
|-----------|------|-----------|-----------|
| 1 | USA | 1 | 2 |
| 2 | South Korea | 3 | 8 |
| 3 | China | 2 | 6 |
| 4 | Singapore | 10 | 25 |
| 5 | Taiwan | 18 | 20 |
| 6 | North Korea | 4 | -- |
| 7= | Russia | 8 | 4 |
| 7= | UK | 22 | 5 |
| 9 | Hong Kong | 28 | 27 |
| 10 | Japan | 22 | 11 |
| 11 | Vietnam | 5 | 7 |
| 12= | Canada | 9 | 16 |
| 12= | Thailand | 12 | 47 |
| 14 | Hungary | 20 | 3 |
| 15= | Brazil | 22 | 52 |
| 15= | Italy | 29 | 25 |
| 17 | Philippines | 36 | 74 |
| 18 | Bulgaria | 29 | 11 |
| 19 | Germany | 27 | 10 |
| 20= | Romania | 13 | 1 |
| 20= | Indonesia | 29 | 70 |
| 22 | Israel | 40 | 15 |
| 23 | Mexico | 19 | 53 |
| 24 | Iran | 7 | 9 |
| 25= | Australia | 6 | 23 |
| 25= | France | 14 | 32 |
| 25= | Peru | 16 | -- |
| 28 | Kazakhstan | 28 | 25 |
| 29 | Turkey | 20 | 19 |
| 30= | Armenia | 26 | 34 |
| 30= | Croatia | 15 | 34 |
| 30= | Ukraine | 11 | 18 |
| 33 | Mongolia | 35 | 44 |
| 34 | India | 34 | 37 |
| 35= | Bangladesh | 33 | -- |
| 35= | Belarus | 39 | 21 |
| 37= | Czech Republic | 45 | 28 |
| 37= | Sweden | 60 | 40 |
| 39 | Macau | 35 | 48 |
| 40 | Serbia | 40 | 29 |
| 41 | Saudi Arabia | 41 | -- |
| 42 | Poland | 17 | 13 |
| 43 | Switzerland | 45 | 62 |
| 44 | Netherlands | 43 | 59 |
| 45 | Bosnia-Herzegovina | 43 | 57 |
| 46 | Austria | 60 | 42 |
| 47 | Portugal | 52 | 13 |
| 48 | Syria | 54 | -- |
| 49 | Spain | 72 | 48 |
| 50= | Lithuania | 65 | 32 |
| 50= | Greece | 51 | 22 |
| 52 | Belgium | 56 | 31 |
| 53 | New Zealand | 49 | -- |
| 54 | Azerbaijan | 48 | 58 |
| 55 | Slovakia | 33 | 17 |
| 56 | Malaysia | 57 | 72 |
| 57 | Argentina | 52 | 29 |
| 58 | South Africa | 55 | 43 |
| 59= | Costa Rica | 67 | -- |
| 59= | Georgia | 42 | 30 |
| 61 | Estonia | 70 | 55 |
| 62 | Tajikistan | 64 | -- |
| 63= | Moldova | 38 | 41 |
| 63= | Slovenia | 73 | 44 |
| 63= | Cyprus | 63 | 69 |
| 66= | Sri Lanka | 70 | 53 |
| 66= | Colombia | 49 | 46 |
| 68 | El Salvador | 95 | -- |
| 69= | Albania | 77 | 67 |
| 69= | Turkmenistan | 58 | 72 |
| 71= | Finland | 82 | 39 |
| 71= | Paraguay | 67 | -- |
| 73 | Macedonia | 74 | 46 |
| 74 | Latvia | 79 | 33 |
| 75 | Ireland | 77 | 61 |
| 76 | Tunisia | 75 | -- |
| 77= | Kosovo | 86 | -- |
| 77= | Uzbekistan | 58 | -- |
| 79 | Morocco | 80 | 65 |
| 80 | Nicaragua | 82 | -- |
| 81 | Denmark | 69 | 48 |
| 82 | Algeria | 62 | -- |
| 83 | Ecuador | 80 | -- |
| 84= | Kyrgyzstan | 92 | 67 |
| 84= | Norway | 65 | 37 |
| 86 | Venezuela | 96 | -- |
| 87 | Puerto Rico | 90 | -- |
| 88= | Montenegro | 89 | |
| 88= | Nigeria | 88 | |
| 90 | Iceland | 75 | 56 |
| 91= | Chile | 97 | 71 |
| 91= | Pakistan | 85 | -- |
| 93 | Uruguay | 93 | -- |
| 94 | Trinidad & Tobago | 82 | 60 |
| 95 | Luxembourg | 97 | -- |
| 96= | Cambodia | 86 | -- |
| 96= | Myanmar | -- | -- |
| 98 | Uganda | 100 | -- |
| 99 | Kenya | -- | -- |
| 100= | Honduras | -- | -- |
| 100= | Madagascar | -- | -- |
| 102 | Jamaica | 102 | -- |
| 103 | Botswana | 103 | -- |
| 104= | Egypt | | |
| 104= | Ghana | 101 | -- |
| 106 | Tanzania | 106 | -- |
| 107= | Iraq | -- | -- |
| 107= | Liechtenstein | 90 | -- |
| 109 | Laos | -- | -- |