
Saturday, June 27, 2015

Why Russia Might Rise Fairly Quickly in the Rankings After Falling a Bit

An article  by Alex Usher in Higher Education in Russia and Beyond, reprinted in University World News, suggests five structural reasons why Russian universities will not rise very quickly in the global rankings. These are:


  • the concentration of resources in academies rather than universities


  • excessive specialisation among existing universities


  • a shortage of researchers  caused by the economic crisis of the nineties


  • excessive bureaucratic control over research projects

  • limited fluency in English.

Over the next couple of years things might even get a bit worse. QS are considering introducing a sensible form of field normalisation, just for the five main subject groups. This might not happen since they are well aware of the further advantages it would give to English-speaking universities, especially Oxbridge and places like Yale and Princeton, which are strong in the humanities and social sciences. But if it did happen it would not be good for Russian universities. Meanwhile, THE has spoken about doing something about hugely cited multi-authored physics papers, and that could drastically affect institutions like MEPhI.

But after that, there are special features in the QS and THE world rankings that could be exploited by Russian universities. 

Russia is surrounded by former Soviet countries where Russian is widely used and which could provide large numbers of international research collaborators, an indicator in the THE rankings, and could be a source of international students and faculty, indicators in the THE and QS rankings and a source of respondents to the THE and QS academic surveys.

Russia might also consider tapping the Chinese supply of bright students for STEM subjects. It is likely that the red bourgeoisie will start wondering about the wisdom of sending their heirs to universities that give academic credit for things like walking around with a mattress or not shaving armpit hair, and will think instead about a degree in engineering from Moscow State or MEPhI.

Russian universities also appear to have a strong bias towards applied sciences and vocational training that should, if marketed properly, produce high scores in the QS employer survey and the THE Industry Income: Innovation indicator.






Saturday, December 21, 2013

Twenty Ways to Rise in the Rankings Quickly and Fairly Painlessly


Times Higher Education has just republished an article by Amanda Goodall, ‘Top 20 ways to improve your world university ranking’.  Much of her advice is very sensible -- appointing university leaders with a strong research record, for example -- but in most cases the road from her suggestions to a perceptible improvement in the rankings is likely to be winding and very long. It is unlikely that any of her proposals would have much effect on the rankings in less than a decade or even two.


So here are 20 realistic proposals for a university wishing to join the rankings game.


Before starting, bear in mind that any advice about how a university can rise in the rankings should be based on these principles.


  • Rankings are proliferating and no doubt there will be more in the future. There is something for almost anybody if you look carefully enough.

  • The indicators and methodology of the better known rankings are very different. Something that works with one may not work with another. It might even have a negative effect.

  • There is often a price to pay for getting ahead in the rankings. Everybody should consider whether it is worth it. Also, while rising from 300th place to 250th is quite easy, going from 30th to 25th is another matter.

  • Don’t forget the number on the bottom. It might be easier to reduce the number of academic staff than to increase the number of citations or publications.

  • Rankings are at best an approximation to what universities do. Nobody should get too excited about them.


The top 20 ways in which universities can quickly improve their positions in one or more of the international university rankings are:

 1.  Get rid of students

Over the years many universities acquire a collection of branch campuses, general studies programmes, night schools, pre-degree programmes and so on. Set them free to become independent universities or colleges. Almost always, these places have relatively more students and relatively fewer faculty than the main campus. The university will therefore do better in the Quacquarelli Symonds (QS) and Times Higher Education (THE) faculty student ratio indicators.  Also, staff in the spun off branches and schools generally produce less research than those at the main campus so you will get a boost in the productivity per capita indicator in the Shanghai ARWU rankings.

2.  Kick out the old and bring in the young

Get rid of ageing professors, especially if unproductive and expensive, and hire lots of adjunct and temporary teachers and researchers (indentured servants, in effect). Again, this will improve the university’s performance on the THE and QS faculty student ratio indicators. They will not count as senior faculty so this will be helpful for ARWU.

3.  Hire research assistants

Recruiting cheap or unpaid research assistants (slave labour in all but name: unemployed or unemployable graduate interns?) will boost the score for faculty student ratio in the QS rankings, since QS counts research-only staff for their faculty student indicator. It will not, however, work for the THE rankings. Remember that for QS more faculty are good for faculty student ratio but bad for citations per faculty, so you have to analyse the potential trade-off carefully.
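
The arithmetic of that trade-off is easy to sketch. The figures below are invented, and the real QS scores are standardised against all ranked universities, so this is only a rough illustration of the direction of the effect:

```python
# Rough illustration (invented figures): hiring research-only staff improves the
# QS faculty-student ratio input but dilutes citations per faculty.

students = 20000
teaching_staff = 1000
citations_5yr = 50000   # hypothetical five-year citation total

def qs_inputs(research_only_staff):
    faculty = teaching_staff + research_only_staff  # QS counts research-only staff as faculty
    return (students / faculty,        # students per faculty member: lower is better
            citations_5yr / faculty)   # citations per faculty member: higher is better

for hires in (0, 200, 500):
    ratio, cites = qs_inputs(hires)
    print(f"{hires} research hires: {ratio:.1f} students/faculty, {cites:.1f} citations/faculty")
```

On these numbers, 500 extra research-only staff cut the student-staff ratio from 20 to about 13 but also cut citations per faculty member from 50 to about 33, which is why the trade-off needs analysing before anyone starts hiring.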

4.  Think about an exit option

If an emerging university wants to be included in the rankings it might be better to focus on just one of them. Panjab University is doing very well in the THE rankings but does not appear in the QS rankings. But remember that if you apply to be ranked by THE and you do not like your placing, it is always possible to opt out by not submitting data next year. But QS has a Hotel California policy: once in, you can check out but you can never leave. It does not matter how much you complain about the unique qualities of your institution and how they are neglected by the rankers; QS will go on ranking you whether you like it or not.

5. Get a medical school

If you do not have a medical school or a research and/or teaching hospital then get one from somewhere. Merge with an existing one or start your own. If you have one, get another one. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember this strategy may not help so much with THE, who use field normalisation. Citations of medical research will help there only if they are above the world average for field and year.

Update August 2016: QS now have a moderate form of field normalisation so the advantage of a medical school is reduced but the Shanghai rankings are still biased towards medical research.
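
As a rough sketch of what field normalisation of this kind does (the world baselines below are invented, not THE's actual figures), a paper is credited not with its raw citation count but with that count divided by the world average for its field and year:

```python
# Toy field normalisation: citations divided by an (invented) world average
# for the same field and publication year.

world_baseline = {
    ("clinical medicine", 2012): 12.0,  # hypothetical average citations per paper
    ("mathematics", 2012): 2.5,
}

papers = [
    {"field": "clinical medicine", "year": 2012, "citations": 15},
    {"field": "mathematics", "year": 2012, "citations": 6},
]

for p in papers:
    impact = p["citations"] / world_baseline[(p["field"], p["year"])]
    print(p["field"], round(impact, 2))
# clinical medicine 1.25  -> more raw citations, modest normalised impact
# mathematics 2.4         -> fewer raw citations, higher normalised impact
```

So the sheer volume of medical citations helps with raw-count indicators like ARWU publications or the old QS citations per faculty, but much less with a normalised indicator.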

6. But if you are a medical school, diversify

QS and THE supposedly do not include single-subject institutions in their general rankings, although from time to time one will, like the University of California at San Francisco, Aston Business School or (National Research Nuclear University) Moscow Engineering Physics Institute (MEPhI), slip through. If you are an independent medical or single-subject institution, consider adding one or two more subjects; then QS and THE will count you, although you will probably start sliding down the ARWU table.

Update August 2016: the QS BRICS rankings include some Russian institutions that look like they focus on one field, and National Research Nuclear University MEPhI is back in the THE world rankings.

7. Amalgamate

The Shanghai rankings count the total number of publications in the SCI and SSCI, the total number of highly cited researchers and the total number of papers without regard for the number of researchers. THE and QS count the number of votes in their surveys without adjusting for the size of the institution.

What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?

Update August 2016: This is pretty much what the University of Paris-Saclay is doing.

8. Consider the weighting of the rankings

THE gives a 30% weighting to citations and 2.5% to income from industry. QS gives 40% to its academic survey and 5% to international faculty. So think about where you are going to spend your money.
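
A minimal sketch of what those weightings mean for the overall score (the ten-point improvement is hypothetical, and indicator scores are treated here as simple 0-100 values):

```python
# Marginal gain in the weighted overall score from a ten-point rise
# in one indicator, using the published weightings quoted above.

weights = {
    "THE citations": 0.30,
    "THE industry income": 0.025,
    "QS academic survey": 0.40,
    "QS international faculty": 0.05,
}

improvement = 10  # hypothetical rise on a 0-100 indicator scale

for indicator, w in weights.items():
    print(f"{indicator}: +{improvement * w:.2f} overall points")
# THE citations: +3.00, THE industry income: +0.25,
# QS academic survey: +4.00, QS international faculty: +0.50
```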

9.  The wisdom of crowds

Focus on research projects in those fields that have huge multi-“author” publications, particle physics, astronomy and medicine for example. Such publications often have very large numbers of citations. Even if your researchers make a one in two thousandth contribution, Thomson Reuters, THE’s data collector, will give them the same credit as they would get if they were the only authors. This will not work for the Leiden Ranking, which uses fractionalised counting of citations. Note that this strategy works best when combined with number 10.

Update August 2016: THE methodological changes in 2015 mean that this does not work any more. Look at what happened to Middle East Technical University. But it is still worth looking out for projects with dozens or scores of contributors. 
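
A sketch of the difference between whole counting (every listed institution gets full credit, as described above) and the fractional counting Leiden uses, with invented numbers:

```python
# One heavily cited multi-institution paper, credited two ways (invented figures).

citations = 2000      # citations received by the paper
institutions = 200    # number of contributing institutions listed on it

whole_counting = citations                      # full credit to every institution
fractional_counting = citations / institutions  # credit divided among institutions

print(whole_counting, fractional_counting)      # 2000 vs 10.0
```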

10.  Do not produce too much

You need to produce 200 papers a year to be included in the THE rankings. But producing more papers than this might be counterproductive. If your researchers are producing five thousand papers a year then those five hundred citations from a five hundred “author” report on the latest discovery in particle physics will not have much impact. But if you are publishing three hundred papers a year those citations will make a very big difference. This is why Dr El Naschie’s frequently cited papers in Chaos, Solitons and Fractals were a big boost for Alexandria University but not for Cambridge, Surrey, Cornell and Frankfurt universities, with which he also claimed affiliation. However, Leiden will not rank universities until they reach 500 papers a year.

Update August 2016: See number 9.
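
The dilution effect is simple arithmetic. A sketch with invented figures:

```python
# Average citations per paper when a single 500-citation report lands on a
# small annual output versus a large one (all figures invented).

def mean_citations(total_papers, background_cites_per_paper=2.0, bonus=500):
    return (total_papers * background_cites_per_paper + bonus) / total_papers

print(round(mean_citations(300), 2))    # small producer: 3.67 citations per paper
print(round(mean_citations(5000), 2))   # large producer: 2.1 citations per paper
```

The same 500 citations lift the small producer's average by more than 80 per cent but the large producer's by only 5 per cent.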

11.  Moneyball Strategy

In his book Moneyball, Michael Lewis recounted the ascent of the Oakland A’s baseball team through a strategy of buying undervalued players. The idea was to find players who did things that led to their teams winning even if they did not match the stereotype of a talented player.

This strategy was applied by George Mason University in Virginia, which created a top basketball team by recruiting players who were overlooked by scouts because they were too small or too fat, and a top economics department by recruiting advocates of a market economy at a time when such an idea was unfashionable.

Universities could recruit researchers who are prolific and competent but are unpromotable or unemployable because they are in the wrong group or fail to subscribe enthusiastically to current academic orthodoxies. Maybe start with Mark Regnerus and Jason Richwine.

Update August 2016: See the story of Tim Groseclose's move from UCLA to George Mason

12. Expand doctoral programmes

One indicator in the THE world rankings is the ratio of doctoral to bachelor degree students.

Panjab University recently announced that it will introduce integrated master’s and doctoral programmes. This could be a smart move if it means students no longer go into master’s programmes but instead into something that can be counted as a doctoral degree programme.

13.  The importance of names

Make sure that your researchers know which university they are affiliated to and that they know its correct name. Make sure that branch campuses, research institutes and other autonomous or quasi-autonomous groups incorporate the university name in their publications. Keep an eye on Scopus and ISI and make sure they know what you are called. Be especially careful if you are an American state university.

14.   Evaluate staff according to criteria relevant to the rankings

If staff are to be appointed and promoted according to their collegiality, the enthusiasm with which they take part in ISO exercises, community service, ability to make the faculty a pleasant place for everybody or commitment to diversity, then you will get collegial, enthusiastic etc. faculty. But those are things that the rankers do not – for once with good reason – attempt to measure.

While you are about it, get rid of interviews for staff and students. Their predictive validity ranges from zero to low.

15.  Collaborate

The more authors a paper has, the more likely it is to be cited, even if only through self-citation. Also, the more collaborators you have, the greater the chances of a good score in the reputation surveys. And do not forget that the percentage of collaborators who are international is also an indicator in the THE rankings.

16. Rebrand

It would be good to have names that are as distinctive and memorable as possible. Consider a name change. Do you really think that the average scientist filling out the QS or the THE reputation surveys is going to remember which of the sixteen (?) Indian Institutes of Technology is especially good in engineering?

Update August 2016: But not too memorable. I doubt that Lovely Professional University will get the sort of public interest it is hoping for.

17. Be proactive

Rankings are changing all the time so think about indicators that might be introduced in the near future. It would seem quite easy, for example, for rankers to collect data about patent applications.

Update August 2016: Make sure everyone keeps their Google Scholar Citations Profiles up to date.

18. Support your local independence movement

It has been known for a long time that increasing the number of international students and faculty is good for both the THE and QS rankings. But there are drawbacks to just importing students. If it is difficult to move students across borders why not create new borders?

If Scotland votes for independence in next year’s referendum its scores for international students and international faculty in the QS and THE rankings would go up since English and Welsh students and staff would be counted as international.

Update August 2016: Scotland didn't but there may be another chance. 

19. Accept that some things will never work

Realise that there are some things that are quite pointless from a rankings perspective. Or any other for that matter.  Do not bother telling staff and students to click away at the website to get into Webometrics. Believe it or not, there are precautions against that sort of thing. Do not have motivational weekends. Do not have quality initiatives unless they get rid of the cats.

Update August 2016: That should read do not do anything "motivational". The only thing they motivate is the departure of people with other options.

20.  Get Thee to an Island

The Leiden Ranking has a little-known indicator that measures the geographical distance between collaborators. At the moment first place goes to the Australian National University. Move to Easter Island or the Falklands and you will be top for something.

Friday, October 04, 2013

MIT and TMU are the most influential research universities in the world

I hope to comment extensively on the new Times Higher Education - Thomson Reuters rankings in a while but for the moment here is a comment on the citations indicator.

Last year Times Higher Education and Thomson Reuters solemnly informed the world that the two most influential places for research were Rice University in Texas and the Moscow State Engineering Physics Institute (MEPhI).

Now, the top two for Citations: research influence are MIT, which sounds a bit more sensible than Rice, and Tokyo Metropolitan University. Rice has slipped very slightly and MEPhI has disappeared from the general rankings because it was realised that it is a single-subject institution. I wonder how they worked that out.

That may be a bit unfair. What about that paper on opposition politics in central Russia in the 1920s?

Tokyo Metropolitan University's success at first seems rather odd because it also has a very low score for Research, which probably means that it has a poor reputation for research, does not receive much funding, has few graduate students and/or publishes few papers. So how could its research be so influential?

The answer is that it was one of scores of contributors to a couple of multi-authored publications on particle physics and a handful of widely cited papers in genetics and also produced few papers overall. I will let Thomson Reuters explain how that makes it into a pocket or a mountain of excellence.

Tuesday, June 25, 2013

What about a Research Influence Ranking?

Keeping up with the current surge of global university rankings is becoming next to impossible. Still, there are a few niches that remain unoccupied. One might be a ranking of universities according to their ability to spread new knowledge around the world. So it might be a good idea to have a Research Influence Ranking based on the citations indicator in the Times Higher Education - Thomson Reuters World University Rankings.

Thomson Reuters are the world's leading collectors and analysts of citations data, so such an index ought to provide an invaluable data source for governments, corporations and other stakeholders deciding where to place research funding. Data for 400 universities can be found on the THE iPhone/iPad app.

The top place in the world would be jointly held by Rice University in Texas and Moscow State Engineering Physics Institute, closely followed by MIT and the University of California Santa Cruz.

Then there are the first places in various regions and countries. (MEPhI would be first in Europe and Rice in the US and North America.)

Canada
University of Toronto

Latin America
University of the Andes, Colombia

United Kingdom (and Western Europe)
Royal Holloway London

Africa
University of Cape Town

Middle East
Koc University, Turkey

Asia (and Japan)
Tokyo Metropolitan University

ASEAN
King Mongkut's University of Technology, Thailand

Australia and the Pacific
University of Melbourne

On second thoughts, perhaps not such a good idea.


Saturday, April 20, 2013

The Leiden Ranking

The Leiden ranking for 2013 is out. This is produced by the Centre for Science and Technology Studies (CWTS) at Leiden University and represents pretty much the state of the art in assessing research publications and citations.

A variety of indicators are presented with several different settings, but no overall winner is declared, which means that these rankings are not going to get the publicity given to QS and Times Higher Education.

Here are the top universities, using the default settings provided by CWTS.

Total Publications: Harvard
Citations per Paper: MIT
Normalised Citations per Paper: MIT
Quality of Publications: MIT

There are also indicators for international and industrial collaboration that I hope to discuss later.

It is also noticeable that high flyers in the Times Higher Education citations indicator, Alexandria University, Moscow Engineering Physics Institute (MEPhI), Hong Kong Baptist University, Royal Holloway, do not figure at all in the Leiden Ranking. What happened to them?

How could MEPhI, equal first in the world for research influence according to THE and Thomson Reuters, fail to even show up in the normalised citation indicator in the Leiden Ranking?

Firstly, Leiden have collected data for the top 500 universities in the world according to number of publications in the Web of Science. That would have been sufficient to keep these institutions out of the rankings.

In addition, Leiden use fractionalised counting as a default setting, so that the impact of multiple-author publications is divided by the number of university addresses. This would drastically reduce the impact of publications like the Review of Particle Physics.

Also, by field Leiden mean five broad subject groups, whereas Thomson Reuters appear to use a larger number (21, if they use the same system as they do for highly cited researchers). There is accordingly more chance of anomalous cases having a great influence in the THE rankings.

THE and Thomson Reuters would do well to look at the multi-authored, and most probably soon to be multi-cited, papers that were published in 2012 and identify the universities that could do well in 2014 if the methodology remains unchanged.


Sunday, November 18, 2012

Article in University World News

Ranking’s research impact indicator is skewed

Saturday, October 27, 2012

More on MEPhI

Right after putting up the post on Moscow State Engineering Physics Institute and its "achievement" in getting the maximum score for research impact in the latest THE - TR World University Rankings, I found this exchange on Facebook.  See my comments at the end.

  • Valery Adzhiev So, the best university in the world in the "citation" (i.e. "research influence") category is Moscow State Engineering Physics Institute with maximum '100' score. This is remarkable achivement by any standards. At the same time it scored in "research" just 10.6 (out of 100) which is very, very low result. How on earth that can be?
  • Times Higher Education World University Rankings Hi Valery,

    Regarding MEPHI’s high citation impact, there are two causes: Firstly they have a couple of extremely highly cited papers out of a very low volume of papers. The two extremely highly cited papers are skewing what would ordinarily be a very good normalized citation impact to an even higher level.

    We also apply "regional modification" to the Normalized Citation Impact. This is an adjustment that we make to take into account the different citation cultures of each country (because of things like language and research policy). In the case of Russia, because the underlying citation impact of the country is low it means that Russian universities get a bit of a boost for the Normalized Citation Impact.

    MEPHI is right on the boundary for meeting the minimum requirement for the THE World University Rankings, and for this reason was excluded from the rankings in previous years. There is still a big concern with the number of papers being so low and I think we may see MEPHI’s citation impact change considerably over time as the effect of the above mentioned 2 papers go out of the system (although there will probably be new ones come in).

    Hope this helps to explain things.
    THE
  • Valery Adzhiev Thanks for your prompt reply. Unfortunately, the closer look at that case only adds rather awkward questions. "a couple of extremely highly cited papers are actually not "papers": they are biannual volumes titled "The Review of Particle Physics" that ...See More
  • Valery Adzhiev I continue. There are more than 200 authors (in fact, they are "editors") from more than 100 organisation from all over the world, who produce those volumes. Look: just one of them happened to be affiliated with MEPhI - and that rather modest fact (tha...See More
  • Valery Adzhiev Sorry, another addition: I'd just want to repeat that my point is not concerned only with MEPhI - Am talking about your methodology. Look at the "citation score" of some other universities. Royal Holloway, University of London having justt 27.7 in "res...See More
  • Alvin See Great observations, Valery.
  • Times Higher Education World University Rankings Hi Valery,

    Thanks again for your thorough analysis. The citation score is one of 13 indicators within what is a balanced and comprehensive system. Everything is put in place to ensure a balanced overall result, and we put our methodology up online for
    ...See More
  • Andrei Rostovtsev This is in fact rather philosofical point. There are also a number of very scandalous papers with definitively negative scientific impact, but making a lot of noise around. Those have also high contribution to the citation score, but negative impact t...See More

    It is true that two extremely highly cited publications combined with a low total number of publications skewed the results, but what is equally or perhaps more important is that these citations arrive in the first year or two after publication, a period when citations to most papers tend to be relatively infrequent compared with later years. The 2010 publication is a biennial review, like the 2008 publication, that will be cited copiously for two years, after which it will no doubt be superseded by the 2012 edition.

    Also, we should note that in the ISI Web of Science, the 2008 publication is classified as "physics, multidisciplinary". Papers listed as multidisciplinary generally get relatively few citations so if the publication was compared to other multidisciplinary papers it would get an even larger weighting. 
    Valery is right to point out that these publications have over 100 authors or contributors each (I am not sure whether they are actual researchers or administrators). Why, then, did all the other contributors not boost their institutions' scores to similar heights? Partly because they were not in Russia and therefore did not get the regional weighting, but also because their institutions were publishing many more papers overall than MEPhI.

    So basically, A. Romaniouk who contributed 1/173rd of one publication was considered as having more research impact than hundreds of researchers at Harvard, MIT, Caltech etc producing hundreds of papers cited hundreds of times.  Sorry, but is this a ranking of research quality or a lottery?
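
To put rough numbers on this, here is a toy version of a mean normalised citation impact with a regional modifier bolted on. Everything below is invented, including the 1.3 modifier and both publication profiles, and the real Thomson Reuters calculation is more elaborate; the point is only the shape of the arithmetic:

```python
# Toy normalised citation impact: the mean of each paper's citations divided by
# the world average for its field and year, then multiplied by a country boost.
# All figures, including the 1.3 modifier, are invented for illustration.

def citation_indicator(paper_impacts, regional_modifier=1.0):
    return sum(paper_impacts) / len(paper_impacts) * regional_modifier

# Small institution: ~50 papers, one of them a review cited at 100x the world average
small_output = [100.0] + [0.8] * 49
# Large institution: 5,000 papers, each cited at twice the world average
large_output = [2.0] * 5000

print(round(citation_indicator(small_output, regional_modifier=1.3), 2))  # about 3.62
print(round(citation_indicator(large_output), 2))                          # 2.0
```

On these invented numbers the institution with a single outlier and an otherwise modest record comes out well ahead of one producing thousands of well-cited papers, which is the anomaly described above.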

    The worst part of THE's reply is this:

    Thanks again for your thorough analysis. The citation score is one of 13 indicators within what is a balanced and comprehensive system. Everything is put in place to ensure a balanced overall result, and we put our methodology up online for all to see (and indeed scrutinise, which everyone is entitled to do).

    We welcome feedback, are constantly developing our system, and will definitely take your comments on board.

    The system is not balanced. Citations have a weighting of 30%, much more than any other indicator. Even the research reputation survey has a weighting of only 18%. And to describe as comprehensive an indicator which allows a fraction of one or two publications to surpass massive amounts of original and influential research is really plumbing the depths of absurdity.

    I am just about to finish comparing the scores for research and research impact for the top 400 universities. There is a statistically significant correlation but it is quite modest. When research reputation, volume of publications and research income show such a modest correlation with research impact, it is time to ask whether there is a serious problem with this indicator.

    Here is some advice for THE and TR.

    • First, and surely very obvious, if you are going to use field normalisation then calculate the score separately for each discipline group (natural sciences, social sciences and so on) and aggregate the scores. So give MEPhI a 100 for physical or natural sciences if you think they deserve it, but not for the arts and humanities.
    • Second, and also obvious, introduce fractional counting, that is dividing the number of citations by the number of authors of the cited paper.
    • Do not count citations to summaries, reviews or compilations of research.
    • Do not count citations of commercial material about computer programs. This would reduce the very high and implausible score for Göttingen, which is derived from a single publication.
    • Do not assess research impact with only one indicator. See the Leiden ranking for the many ways of rating research.
    • Consider whether it is appropriate to have a regional weighting. This is after all an international ranking.
    • Reduce the weighting for this indicator.
    • Do not count self-citations. Better yet, do not count citations from researchers at the same university.
    • Strictly enforce your rule about not including single-subject institutions in the general rankings.
    • Increase the threshold number of publications for inclusion in the rankings from two hundred to four hundred.