Monday, May 23, 2016

Does a programming competition in Thailand show the future of the world economy?



The ACM ICPC (Association for Computing Machinery International Collegiate Programming Contest) is the "Olympics of Programming Competitions". The competitors are teams of university students who grapple with complex real-world problems. It is a "battle of logic, strategy and mental endurance" and is the apex of a series of local and regional competitions.

Success in the competition requires a high level of intelligence, genuine team formation and rigorous training. It is the antithesis of the intellectual levelling and narcissistic cult of safe spaces that has infected American and, perhaps to a lesser extent, British universities.

The finals have just been completed in Thailand. The top five are:

1.  St. Petersburg State University
2.  Shanghai Jiao Tong University
3.  Harvard University
4.  St. Petersburg Institute of Physics and Technology
5.  University of Warsaw

The list of universities in the top ten, the top fifty and the total number of finalists is interesting. If this competition reflects the current level of intelligence of university students, then the future for China, patches of the rest of Asia, and Russia and Eastern Europe looks bright. The USA may do well if -- a very big if -- it can continue to attract large numbers of Chinese students and immigrants. For Africa and Western Europe, including the UK, the economy of the 21st century may be bleak.

Below, countries are ranked according to the number of universities in the top ten, the top fifty and all finalists.


Rank   Country           Top 10   Top 50   Total Finalists
1      Russia               5       10           12
2      USA                  2        6           23
3      Poland               2        3            3
4      China                1       10           17
5      Brazil               -        2            6
6      Japan                -        2            4
7      Ukraine              -        2            3
8=     Belarus              -        2            2
8=     Taiwan               -        2            2
10     Bangladesh           -        1            3
11=    Canada               -        1            2
11=    Iran                 -        1            2
11=    South Korea          -        1            2
11=    Vietnam              -        1            2
15=    Argentina            -        1            1
15=    Croatia              -        1            1
15=    Finland              -        1            1
15=    Hong Kong            -        1            1
15=    North Korea          -        1            1
15=    Singapore            -        1            1
21     India                -        -            6
22     Egypt                -        -            4
23=    Mexico               -        -            3
23=    Syria                -        -            3
25=    Australia            -        -            2
25=    Colombia             -        -            2
25=    Netherlands          -        -            2
28=    Chile                -        -            1
28=    Cuba                 -        -            1
28=    Czech Republic       -        -            1
28=    Jordan               -        -            1
28=    Macao                -        -            1
28=    Pakistan             -        -            1
28=    Peru                 -        -            1
28=    Philippines          -        -            1
28=    Slovakia             -        -            1
28=    South Africa         -        -            1
28=    Spain                -        -            1
28=    Switzerland          -        -            1
28=    Thailand             -        -            1
28=    UK                   -        -            1
28=    Venezuela            -        -            1

Sunday, May 22, 2016

Don’t rush to conclusions from the THE rankings

My 15th May post has been republished in University World News with a different title.

No need for British universities to worry about their dip in THE reputation rankings

Don’t rush to conclusions from the THE rankings

Sunday, May 15, 2016

One more thing about the THE reputation rankings

I have just remembered something about the THE reputation rankings that is worth noting.

THE have broken out the scores for teaching reputation and research reputation for the first fifty universities and this gives us a chance to ask if there is any meaningful difference between teaching and research reputation.

The answer is that there is not. The correlation between the teaching and the research scores is .986. This is so high that for practical purposes they are exactly the same thing. The 15% weighting given for teaching (actually "postgraduate supervision") reputation may be unrelated to undergraduate teaching or even to taught master's teaching. The emphasis on research in the THE world rankings is therefore even higher than THE claim, at least at the top.
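For anyone who wants to replicate this kind of check, a minimal sketch is below. The two arrays are hypothetical stand-ins for the published top-fifty teaching and research reputation columns, not THE's actual figures.

```python
# Minimal sketch: correlation between teaching and research reputation scores.
# The two arrays are hypothetical placeholders for the published top-50 columns,
# not the actual THE data.
import numpy as np

teaching = np.array([100.0, 83.1, 72.5, 69.8, 66.2, 44.0, 21.3, 10.9])
research = np.array([100.0, 85.4, 74.0, 71.2, 65.0, 45.1, 22.8, 11.5])

r = np.corrcoef(teaching, research)[0, 1]  # Pearson correlation coefficient
print(f"Pearson r = {r:.3f}")
```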

This has already been pointed out by Alex Usher of Higher Education Strategy Associates of Canada, who found a correlation of .991 in 2013.

The THE reputation rankings: Much ado about not very much

Every so often, especially in North America and Western Europe, there is a panic about the impact of government policies on higher education, usually the failure to provide as much money as universities want, or sometimes as many overseas students as they need to fill lecture halls or cover budget deficits. Global university rankings have a lot to do with the onset and spread of these panics.

True to form, the British "quality" media have been getting into a tizzy over the latest edition of the Times Higher Education (THE) world reputation ranking. According to Javier Espinoza, education editor of the Telegraph, top UK universities have been under pressure to admit minority and state school students and have also had difficulty in recruiting foreign students. This has somehow caused them to forget about doing research or teaching the most able students. It seems that academics from countries around the world, where such problems are of course unknown, are reacting by withholding their votes from British universities when responding to the THE survey and transferring their approval to the rising stars of Asia.

This supposedly has caused UK institutions to slide down the rankings and two of them, Bristol and Durham, have even dropped out of the top 100 altogether into the great dark pit of the unranked.

The Guardian notes that Oxford and Cambridge are falling and are now only just in the world's top five, while the Independent quotes Phil Baty as saying that "our evidence - from six massive global surveys over six years, including the views of more than 80,000 scholars - proves the balance of power in higher education and research is slowly shifting from the West to the East".

This, it would seem, is all because of cuts in funding and restrictions on the entry of overseas students and faculty.

All this is rather implausible. First of all, these are reputation rankings. They refer only to one indicator that accounts for 33 percent of the World University Rankings that will appear later this year. It is not certain that the other indicators will go in the same direction.

Secondly, these rankings have not been standardised as they will be when included in the world rankings. This means that the huge gap between the Big Six -- Harvard, MIT, Berkeley, Stanford, Oxford and Cambridge -- and the rest is laid bare, as it will not be in the autumn, and so we can get a rough idea of how many academics were voting for each university. A crude guess is that when we get down to around 50th place the number of votes will be around five hundred, and even fewer when we reach 100th place.

This means that below the 50 mark a shift in the opinion of a few dozen respondents could easily push a university up or down into a new band or even into or out of the top 100.
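To put rough numbers on that, here is a minimal sketch. The vote counts are hypothetical, chosen only to be consistent with the crude guesses above, and the score is assumed to be a simple proportion of the leader's total.

```python
# Minimal sketch of the point about raw vote counts: if a university near the
# bottom of the table attracts only a few hundred nominations, a swing of a few
# dozen respondents produces a large relative change in its score.
# All numbers are hypothetical, chosen only to match the rough guesses in the text.

votes_at_top = 10_000     # hypothetical nominations for the leading university
votes_at_100th = 300      # hypothetical nominations for a university near 100th place
swing = 40                # a few dozen respondents change their answers

score_before = 100 * votes_at_100th / votes_at_top
score_after = 100 * (votes_at_100th - swing) / votes_at_top

print(f"score falls from {score_before:.1f} to {score_after:.1f} "
      f"(a loss of {100 * swing / votes_at_100th:.0f}% of its nominations)")
```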

Another thing we should remember is that the expertise of the researchers in the Scopus database, from which respondents are drawn, is exaggerated. The qualification for receiving a survey form is being the corresponding author of a publication listed in the Scopus database. There is much anecdotal evidence that in some places winning research grants or getting the corresponding author slot has more to do with politics than with merit. The THE survey is better than QS's, which allows anyone with an academic email address to take part, but it does not guarantee that every respondent is an unbiased and senior researcher.

We should also note that, unlike the US News and QS survey indicators, THE takes no measures to damp down year to year fluctuations. Nor does it do anything to prevent academics from supporting their own universities in the survey.

So, do we really need to get excited about a few dozen "senior researchers" withdrawing their support from British universities?

The credibility of these rankings is further undermined by apparent changes in the distribution of responses by subject group. According to the methodology page in Times Higher Education for 2015, 16% of the responses were from the arts and humanities and 19% were from the social sciences, which in that year included business studies and economics. This year, according to the THE methodology page, 9% of the responses were from the arts and humanities, 15% were from the social sciences and 13% were from business and economics, adding up to 28%.

In other words, the responses from the arts and humanities have apparently fallen by seven percentage points, or around 700 responses, and the combined responses from the social sciences and business and economics have apparently risen by nine points, or about 900 responses.
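The arithmetic behind those estimates is straightforward. The sketch below assumes a survey of roughly 10,000 responses, which is what the "seven points is about 700 responses" figures imply rather than a number published by THE; the percentages are those quoted from the methodology pages.

```python
# Quick check of the arithmetic above. The total of roughly 10,000 survey
# responses is an assumption implied by "7 points ~ 700 responses";
# the percentages are those quoted from the THE methodology pages.

total_responses = 10_000          # assumed survey size

arts_2015, arts_2016 = 0.16, 0.09
soc_2015 = 0.19                   # social sciences incl. business and economics
soc_2016 = 0.15 + 0.13            # social sciences plus business and economics

arts_shift = (arts_2016 - arts_2015) * total_responses
soc_shift = (soc_2016 - soc_2015) * total_responses

print(f"arts and humanities: {arts_shift:+.0f} responses")                    # about -700
print(f"social sciences (+ business/economics): {soc_shift:+.0f} responses")  # about +900
```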

If these numbers are accurate, then there has been a very substantial shift among survey respondents from the arts and humanities to the social sciences (inclusive of business and economics), and it is possible that this was sufficient to cause the recorded decline in the reputation scores of British universities, which usually do much better in the arts and humanities than in the social sciences.

In the subject group tables of the THE 2015-16 World University Rankings, Durham, for example, was 28th for arts and humanities and 36th for the social sciences, while Exeter was 71st for arts and humanities and 81st for the social sciences.

At the same time some of those rising Asian universities were definitely stronger in the social sciences than in the humanities: Peking was 52nd for social sciences and 84th for arts and humanities, Hong Kong 39th for social sciences and 44th for arts and humanities, Nanyang Technological University 95th for social sciences and outside the top 100 universities for the arts and humanities.

It is possible that such a symmetrical change could be the result of changes in the way disciplines are classified or even a simple transposition of data. So far, THE have given no indication that this was the case.

It is interesting that an exception to the narrative of British decline is the London Business School, which has risen from the 91-100 band to 81-90.

The general claim that the views of 80,000 academics over six years are evidence of a shift from west to east is also somewhat tenuous. There have been several changes in the collection and organisation of data over the last few years that could affect the outcomes of the reputation survey.

Between 2010-2011 and 2016 the percentage of responses from the social sciences (originally including business and economics) has risen from 19% to 28%, counting social sciences, business and economics together. The shares for clinical and health sciences and the life sciences have fallen somewhat, while there has been a slight rise for the arts and humanities, with a large spike in 2015.

The proportion of responses from the Asia Pacific region and the Middle East has risen from 25% to 36%, while the proportion from the Americas (North and Latin) has fallen from 44% to 25%. The number of languages in which the survey is administered has increased from eight in 2011 to fifteen this year.

The source of respondents has shifted from the Thomson Reuters Web of Science to Scopus, which includes more publications in languages other than English.

The value of these changes is not disputed here but they should make everybody very cautious about using the reputation rankings to make large claims about what is happening to British universities or what the causes of their problems are.




Friday, May 13, 2016

Conference Announcement






More Information Here

Wednesday, May 11, 2016

Does FIFA need help from university rankers?




At the recent International Rankings Expert Group (IREG) conference at the New University of Lisbon, Simon Marginson of the London Institute of Education raised some chuckles by imagining what would happen if the winner of the football World Cup was decided, as the world's top universities often are, by multi-indicator rankings.

He suggested that if the winner of the World Cup was determined by the methods of global university rankings then there could be a 50% weighting for goals scored, 20% for the size of the teams' fan bases, 10% for player endorsement and 20% for media coverage.

This seems a little conservative. Here are some more ideas about improving the World Cup by incorporating the insights and methods of leading global university rankings.

Perhaps we could have an indicator for the Top Party Team, one based on players' salaries or transfer fees or inlinks to the teams' websites.

Another possibility would be diversity, ethics or sustainability-based ranking indicators perhaps rewarding teams according to their gender diversity or players riding bicycles to training or doing community service.

An interesting approach would be to combine goals scored during the tournament with those in previous competitions going back to the earliest games. FIFA could also include an indicator that would allow players to move from one national team to another and then transfer their goals scored to the new team. This proposal might be impractical because of possible perverse incentives.

There was an interesting idea about university rankings from a Russian social science school based on a brilliant and highly distinctive and innovative methodology.  This could be applied to the World Cup although it would give Nepal or Brunei the same chance of winning as Brazil or Germany.

Alternatively FIFA might consider ranking teams by using reputation surveys with five or six different channels, internationalisation (percentage of players signed by clubs in other countries), ratio of administrators to players and the number of times each player is mentioned in the sports media.

However, the best approach for FIFA might be to adopt the methodology of the "gold standard" of university ranking. The flagship indicator would be based on the insight that goal scoring habits and practices vary from country to country and from match to match and that counting the number of goals scored in a game in an absolute manner is a "mortal sin". So in each match the result would be adjusted according to whether goals resulted from a penalty or free kick, the number of defenders between the striker and the goalkeeper, distance from the goal, wind direction, the altitude of the ground, and hostility or support from spectators.

Added to this would be a regional modification whereby the number of goals scored would be divided by the square root of the average number of goals scored in a game by teams from the region where the country is located. This modification is necessary because footballers in some countries do not have the opportunity to network with players in other countries.
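For readers who want to see what such a regional modification would do to a match score, here is an illustrative sketch; the regional averages are invented for the example.

```python
# Illustrative sketch of the "regional modification" described above: a team's
# goals are divided by the square root of the average goals per game scored by
# teams from its region. The regional averages here are invented for the example.
from math import sqrt

regional_avg_goals = {"South America": 2.8, "Oceania": 1.1}   # hypothetical averages

def modified_score(goals, region):
    """Goals adjusted by the square root of the regional average."""
    return goals / sqrt(regional_avg_goals[region])

print(modified_score(3, "South America"))  # 3 goals counts for about 1.79
print(modified_score(3, "Oceania"))        # the same 3 goals counts for about 2.86
```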

Going off topic a bit, maybe FIFA could ask the Oscar awards to introduce best football actor awards for skills in faking injuries, diving in the penalty area or saying that it was all due to team spirit and the support of our wonderful fans.




Saturday, April 23, 2016

America's Sleepiest Colleges



A ranking from JAWBONE reproduced at The Tab lists American colleges and universities according to the hours of sleep that students get. Apparently there is a significant correlation between an institution's US News ranking and late weekday bedtimes.

Here are the top ten universities according to the number of hours slept per night (inverting the JAWBONE list).

1.   Colorado
2=  Tulane
2=  Vermont
4.    Auburn
5.    Arkansas
6=  South Carolina
6=  Iowa
8=  Brown
8=  Florida State
8=  Clemson

Please do not share this too much or demented bureaucrats will start incorporating caffeine injections into their rankings strategy.

Friday, April 22, 2016

Aussies are falling out of love with THE



Matthew Knott in the Sydney Morning Herald reports that Australian education experts are no longer impressed by the Times Higher Education World University Rankings.

"Grattan Institute higher education program director Andrew Norton says The Times rankings are not "terribly high quality".

"They should not be used as a guide for which university to go to and they shouldn't be used as a guide to higher education policy," he says.

In particular, he warns that movements up or down the league table – especially small ones – should not be used as a reliable verdict on whether a university is improving or declining.

And he's not alone.

Australian higher education academic Simon Marginson, one of the leading experts on university rankings, is even more damning.

"In social science terms they are rubbish," he told an academic conference last year."

THE will probably not be bothered too much. After all, when you have been declared "education secretary of the world" in China, who cares if an Australian journalist compares you to dead sticks? 

No more excellence at the University of Missouri






No, it is not a new initiative to identify more inclusive measures of achievement.


From Peter Wood at Minding the Campus:

"The University of Missouri has eliminated Respect and Excellence.  I have to write this in a hurry because it won’t be long before others will seize on this gift.  Respect and Excellence are the names for two residence halls at the University.  They are being closed because the University suddenly finds that its enrollments are plummeting.  Two other dorms were closed already in light of the crisis.
Let’s bask in the irony for a moment or two longer.  The University of Missouri arrived at this juncture by cravenly submitting to the demands of activists and the threats of football players who decided to abet the activists.  On November 9, System President Tim Wolfe and Chancellor R. Bowen Loftin resigned rather than face down those threats."
There is definitely something wrong with the administration if they did not anticipate the reaction to this.

Trinity College Dublin: A Case of Rankings Abuse



Trinity College Dublin (TCD) is giving itself a public flogging over its fall in the rankings. This is rather odd since in fact it has been doing pretty well over the last few years with one exception.

An article in the Irish Times reveals that the Provost of TCD, Patrick Prendergast, is planning a new business school and student accommodation but finds it difficult to raise the necessary cash.


"The college’s planned expansion comes at a time when many in higher education say the sector faces a funding crisis. Rising student numbers, declining state funding and restrictions on staff recruitment mean that many have had to make dramatic cuts to make ends meet.
Trinity and some of the bigger universities have at least been able to plug many of the funding gaps with private income, such as international students’ fees, research and other commercial sources."
According to the provost things are getting pretty rough.
“We have overcrowded classrooms. Our staff-student ratio is way out of kilter. The universities have been very resilient; they have managed to keep going successfully. But is it sustainable? I and other university presidents don’t think it is sustainable at current funding levels.”
What does this have to do with rankings?
"University rankings are considered vital to attracting international students and research funding. However, Trinity, like many Irish universities, has slid down world rankings in recent years as it copes with an increase in students and a reduction in funding.
Last month, TCD found itself at the centre of controversy when one of the main ranking agencies, QS, accused the college of violating its rules by influencing academics involved in its annual survey."
The provost claims:


“If Ireland really wants to be an island known for the talent of its people, and have companies locate here, then we can’t afford to have that one global indicator of the quality of education systems – rankings – decline."

But is it true that TCD is sliding down the rankings?

Let's take a look at the Shanghai Academic Ranking of World Universities (ARWU). This measures research in the natural and social sciences at the highest level. The methodology of these rankings has remained largely stable with some minor exceptions. In 2004 there were some changes to help social science institutions a bit and in 2014 they began to use a new list of highly cited researchers. The latter move helped TCD a bit in 2014 and, assuming the highly cited researchers do not leave, will help a bit more later this year.

Therefore, if there has been a significant change in scores in these rankings it is a reasonable assumption that this does actually reflect a change in quality.

Looking at the scores for the separate indicators in the Shanghai rankings, the score for alumni with Nobel and Fields awards fell from 15.4 to 10.3, for faculty with awards from 14.4 to 13.3, and for papers in Nature and Science very slightly from 13.2 to 13.1 between 2004 and 2015.

In contrast, TCD did much better for highly cited researchers (from zero to 12.3), publications in Web of Science journals (27.1 to 31.0) and productivity per capita (the sum of the above indicators divided by number of faculty). For all criteria, the top scoring university, Caltech for productivity and Harvard for everything else, gets 100.
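As a rough cross-check on the direction of travel, the indicator scores quoted above can be combined using the standard ARWU weights (alumni 10%, awards 20%, highly cited researchers 20%, Nature and Science 20%, publications 20%, productivity per capita 10%). The sketch below does this; the per capita scores are hypothetical placeholders since they are not quoted here, so the result is illustrative only.

```python
# Rough cross-check: combine TCD's indicator scores with the usual ARWU weights
# (alumni 10%, awards 20%, highly cited 20%, Nature & Science 20%,
# publications 20%, per capita productivity 10%). The per capita (pcp) scores
# are hypothetical placeholders, as they are not quoted in the text.

weights = {"alumni": 0.10, "award": 0.20, "hici": 0.20,
           "ns": 0.20, "pub": 0.20, "pcp": 0.10}

tcd_2004 = {"alumni": 15.4, "award": 14.4, "hici": 0.0,
            "ns": 13.2, "pub": 27.1, "pcp": 15.0}   # pcp assumed
tcd_2015 = {"alumni": 10.3, "award": 13.3, "hici": 12.3,
            "ns": 13.1, "pub": 31.0, "pcp": 17.0}   # pcp assumed

def composite(scores):
    """Weighted ARWU-style overall score."""
    return sum(weights[k] * scores[k] for k in weights)

print(f"2004: {composite(tcd_2004):.1f}, 2015: {composite(tcd_2015):.1f}")
```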

Overall, TCD moved from the 202-300 band to the 151-200 band. A rough guess is that TCD has moved up about 15 places altogether. About five of these places were gained because of the new highly cited researchers list. At this rate it could be in the top 100 in another century or so. That is very long term but as Basil Fawlty said, "these things take time".

Of course, the Shanghai rankings do not measure teaching, quality of students, income or internationalisation. For a detailed look at a more diverse set of criteria we can turn to the Round University Rankings. These are published by a Russian company but use data from Thomson Reuters, the same source that Times Higher Education (THE) used from 2010 to 2014. There are, however, 20 indicators compared with THE's 13, and they cover the period from 2010 to 2016.

Overall, TCD improved significantly, going from 174th to 102nd place.

In the teaching dimension, TCD rose slightly from 207th place to 197th, doing slightly better for two doctoral degree indicators, quite a bit worse for two academic staff indicators and staying in exactly the same place for teaching reputation (187th).

For research, TCD improved significantly from 2010 to 2016, going from 193rd to 67th, did much better for papers, citations, completed doctoral degrees and research reputation but did worse for normalised citation impact.

TCD did very well for international diversity in 2010 (31st) but slipped back a bit by 2016 (46th).

For financial sustainability, TCD 's relative position worsened considerably falling from 229th to 412th.

So the data from RUR suggests that TCD's income and faculty resources were declining relative to other universities over this period. But so far this has had no significant effect on TCD's teaching profile, while research has improved noticeably in both quantity and quality.

TCD could quite plausibly claim to be the university with a tiger in its tank, producing more research and educating more students with limited financial and faculty resources.

Turning to the QS scores, in 2004 TCD was ranked 87th in the world and by 2014 had risen to 71st place, but fell back to 78th in 2015, still better than in 2004. It is not a good idea to draw any conclusions from the decline between 2014 and 2015 because in 2015 QS introduced a number of substantial methodological changes.

To summarise, TCD was doing better over fairly long periods in the Shanghai, RUR and QS rankings. Possibly, the fall in the QS rankings between 2014 and 2015 might portend difficulty ahead, and perhaps the fall in income documented in RUR may eventually have a knock-on effect, although so far it has not. Still, it seems that TCD has on the whole been doing well. So what is the Provost talking about?

It is the THE rankings that appear to show TCD in a bad light. In 2010 TCD was in 76th place overall and by 2015 had fallen to 160th. It is difficult to tell exactly what happened because 11 of the 13 indicators are bundled into three super-indicators and it is not clear exactly what contributed to rising or falling scores. In 2015 TCD had higher scores for international orientation and lower scores for Teaching, Research, Citations and Income from Industry. The biggest decline was in the Research score, from 45.3 to 30.8.

Clearly, THE is the odd man out as far as TCD is concerned and should not be taken too seriously. Firstly, there were major methodological changes last year which produced upheavals for universities around the world, including those in France, Korea and Turkey. There was another batch of changes in 2011. In addition, these rankings generate a lot of noise because of exchange rate fluctuations, the use of surveys which can be quite volatile outside the top fifty or so, and a citations indicator where (until last year) a single paper or adjunct faculty could produce an enormous change in scores.

THE have said -- and here they must be given credit -- that: "Because of changes in the underlying data, we strongly advise against direct comparisons with previous years’ World University Rankings."

It seems that TCD is doing the academic equivalent of taking a dive.

Monday, April 18, 2016

Off Topic: The Independent Gets Really Creative

The Independent has just posted a creativity test composed of one question. I can remember this question from a non-examination social studies class in grammar school several decades ago.

Surely the voice of the British intelligentsia can be more creative than that.

Round University Rankings


The latest Round University Rankings have been released by the Russian company, RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the scores in the RUR are listed separately in the profiles. In addition, the rankings provide data for seven consecutive years from 2010 to 2016. This provides an unusual opportunity to examine in detail the development of universities over a period of seven years, measured by 20 indicators. This is not the case with other rankings, which have fewer indicators or which have changed their methodology.

It should be noted that participation in the GIPP is voluntary and therefore the universities in each edition could be different. For example, in 2015, 100 universities dropped out of the project and 62 joined.

It is, however, possible to examine a number of claims that have been made about changes in university quality over the last few years. I will take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage they have received inside Russia.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University]  has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Saturday, April 16, 2016

Some fairly new news from THE

The latest of many spin offs from the THE world university rankings is a list of 150 universities less than 50 years old. The data is extracted from the world rankings released at the end of last year. THE have, however, reduced the weighting given to their academic reputation survey and so the results are a little different.

Is it possible that THE are thinking about reducing the weighting for the reputation survey in this year's world rankings?

Here are the top ten new universities with their overall scores and then the overall scores in the World University Rankings in brackets. It can be seen that changing the weighting for this indicator does in some cases make a difference, although not a spectacular one.


Place   University                                            Country        Overall score (WUR score)
1       Ecole Polytechnique Federale Lausanne                 Switzerland    76.8 (76.1)
2       Nanyang Technological University                      Singapore      72.5 (68.2)
3       Hong Kong University of Science and Technology        Hong Kong      70.7 (67.2)
4       Maastricht University                                 Netherlands    66.1 (59.9)
5       Pohang University of Science and Technology           Korea          65.5 (56.9)
6       Korea Advanced Institute of Science and Technology    Korea          60.8 (53)
7       University of Konstanz                                Germany        58.9 (50.8)
8       Karlsruhe Institute of Technology                     Germany        58.6 (54.5)
9       Pierre and Marie Curie University                     France         58.2 (57)
10      Scuola Normale Superiore Pisa                         Italy          57.3 (50.2)

Most of the 150 universities were outside the top 200 in the world rankings and did not receive an overall score (although it could be calculated easily enough) so there is some new data here. 
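The mechanics of such a reweighting are simple enough. Below is a minimal sketch of cutting a reputation component and renormalising the remaining weights; the weights and scores are invented for illustration and are not THE's actual figures, but they show why universities that score better on the non-reputational indicators than on the survey tend to gain.

```python
# Hypothetical illustration of reducing the weight of a reputation component and
# renormalising the rest, which tends to lift universities that are strong on the
# non-reputational indicators. The weights and scores are invented, not THE's.

def overall(scores, weights):
    total_w = sum(weights.values())
    return sum(scores[k] * w for k, w in weights.items()) / total_w

scores = {"reputation": 40.0, "citations": 85.0, "other": 70.0}

full_weights = {"reputation": 0.33, "citations": 0.30, "other": 0.37}
reduced_weights = {"reputation": 0.20, "citations": 0.30, "other": 0.37}  # reputation cut, rest renormalised

print(f"with full reputation weight:    {overall(scores, full_weights):.1f}")
print(f"with reduced reputation weight: {overall(scores, reduced_weights):.1f}")
```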

As expected the young universities rankings have received a lot of publicity from the world media. Here's a sample.

University of Calgary ranks again as a top young global university by Times Higher Education










Thursday, April 14, 2016

Are there any more rankings left?

There seems to be an unending stream of new rankings. So far, from the big three or four, we have had subject rankings, field rankings, European, Asian, African, Latin American, and Middle East and North Africa rankings, BRICS and emerging economies rankings, reputation rankings, young universities, old universities, most international universities rankings, and research income from industry rankings.

From outside the charmed triangle or square we have had random rankings, length of name rankings, green rankings, Twitter and LinkedIn rankings and rich universities rankings, and of course in the USA a mixed bag of best universities for squirrels, gay-friendly campuses, top party schools and so on. I am a keen follower of the latter: when the US Air Force Academy gets in the top ten I shall pack up and move to Antarctica.

So are there any more international university rankings in the pipeline?

A few suggestions. Commonwealth universities, OIC universities, cold universities, high universities (altitude that is), poor universities, fertile universities (measured by branch campuses).

One that would be fun to see would be a Research Impact ranking based on those universities that have achieved a top placing in the THE year- and field-normalised, regionally modified, standardised citations ranking.

Some notable inclusions would be St. George's University of London, Rice University, Tokyo Metropolitan University, Federico Santa Maria Technical University, Florida Institute of Technology, National Research Nuclear University MEPhI and Alexandria University.



Monday, April 11, 2016

Ranking Rankers' Twitter Accounts

Over at Campus Morning Mail, Stephen Matchett reflects on the limited attention that U-Multirank has received compared with THE's ranking of 150 young universities. He notes that THE has 192,000 followers on Twitter, while U-Multirank has only 2,390.

Digging further, here are the number of twitter followers for various people and organisations that have something to do with international university rankings.


World Uni Ranking (THE) 25,400
Phil Baty (THE) 15,300
QS 3,667
Bob Morse (US News) 2,122
Isidro Aguillo (Webometrics) 2,221
Ellie Bothwell, (THE) 2,049
World Univ Ranking (Shanghai) 1,226
Ben Sowter (QS) 1,115
CWTS Leiden 641
Proyecto U-Ranking (Spain) 561
RUR ranking (Russia)   532
5 top 100 (Russia)  355
Centre for Higher Education (Germany) 308
Richard Holmes 175

Sunday, April 10, 2016

Interview with Round University Ranking

See here for the text of an interview with Round University Ranking of Russia.


How to Survive Changes in Ranking Methodology

How to survive changes in ranking methodology