Dancing in the Street in Moscow
"Jubilant crowds poured into the streets of Moscow when it was announced that Moscow State Engineering Physics Institute had been declared to be the joint top university in the world, along with Rice University in Texas, for research impact".
Just kidding about the celebrations.
But the Times Higher Education - Thomson Reuters World University Rankings have given the "Moscow State Engineering Physics Institute"
a score of 100 for research impact, which is measured by the number of citations per paper normalised by field, year of publication and country.
There are a couple of odd things about this.
First, "Moscow State Engineering Physics Institute" was reorganised in 2009 and its official title is now
National Research Nuclear University MEPhI. It still seems normal to refer to it as MEPhI or the Moscow State Engineering Physics Institute, so I will not argue about this. But I wonder whether there has been some confusion in TR's data collection.
Second, THE says that institutions
are not ranked if they teach only a single narrow subject. Does the institution teach more than just physics?
So how did MEPhI do it? The answer seems to lie in a couple of massively cited review articles. The first was by
C. Amsler et (many, many) alia in
Physics Letters B of September 2008, entitled Review of Particle Physics. It was cited 1278 times in 2009 and 1627 times in 2010 according to the Web of Science, and even more according to Google Scholar.
Here is the abstract.
"Abstract: This biennial Review summarizes
much of particle physics. Using data from previous editions, plus 2778
new measurements from 645 papers, we list, evaluate, and average
measured properties of gauge bosons, leptons, quarks, mesons, and
baryons. We also summarize searches for hypothetical particles such as
Higgs bosons, heavy neutrinos, and supersymmetric particles. All the
particle properties and search limits are listed in Summary Tables. We
also give numerous tables, figures, formulae, and reviews of topics such
as the Standard Model, particle detectors, probability, and
statistics. Among the 108 reviews are many that are new or heavily
revised including those on CKM quark-mixing matrix, V-ud & V-us,
V-cb & V-ub, top quark, muon anomalous magnetic moment, extra
dimensions, particle detectors, cosmic background radiation, dark
matter, cosmological parameters, and big bang cosmology".
I have not counted the number of authors, but there are 113 institutional affiliations, of which MEPhI is 84th.
The second paper is by K. Nakamura et alia. It is also entitled Review of Particle Physics and was published in the
Journal of Physics G: Nuclear and Particle Physics in July 2010. It was cited 1240 times in 2011. This is the abstract.
"This biennial Review summarizes much of particle physics. Using data
from previous editions, plus 2158 new measurements from 551 papers, we
list, evaluate, and average measured properties of gauge bosons,
leptons, quarks, mesons, and baryons. We also summarize searches for
hypothetical particles such as Higgs bosons, heavy neutrinos, and
supersymmetric particles. All the particle properties and search limits
are listed in Summary Tables. We also give numerous tables, figures,
formulae, and reviews of topics such as the Standard Model, particle
detectors, probability, and statistics. Among the 108 reviews are many
that are new or heavily revised including those on neutrino mass,
mixing, and oscillations, QCD, top quark, CKM quark-mixing matrix, V-ud
& V-us, V-cb & V-ub, fragmentation functions, particle detectors
for accelerator and non-accelerator physics, magnetic monopoles,
cosmological parameters, and big bang cosmology".
There are 119 affiliations, of which MEPhI is 91st.
Let me stress that there is nothing improper here. It is normal for papers in the physical sciences to include summaries or reviews of research at the beginning of a literature review. I also assume that the similarity in the wording of the abstracts would be considered appropriate standardisation within the discipline rather than plagiarism.
TR's method compares the number of citations a paper receives with the average for its field, year and country. MEPhI would not get very much credit simply for publishing in physics, which is quite a highly cited discipline, but it would get some for being in Russia, where citations are relatively sparse, and a massive boost for exceeding the average citation count within one or two years of publication many times over.
There is one other factor. MEPhI was only one of more than 100 institutions contributing to each of these papers, but it got such an unusually massive score because its citations, already magnified by region and period of publication, were divided by a comparatively small total number of publications.
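A toy sketch can show how this arithmetic plays out. The numbers below are invented for illustration, and the formula is a simplified version of field normalisation (each paper's citations divided by the expected average for its field, year and country, then averaged over the institution's papers); it is not TR's actual algorithm.

```python
# Hypothetical illustration of field-normalised citation impact.
# Each paper's citation count is divided by the expected (average)
# citation count for its field, year and country; the institution's
# score is the mean of these ratios over its papers.

def normalised_impact(papers):
    """papers: list of (citations, expected_citations) tuples."""
    ratios = [cites / expected for cites, expected in papers]
    return sum(ratios) / len(ratios)

# A small institution: ten papers, one of them a massively cited review
# (assumed expected average: 10 citations per paper).
small = [(1278, 10.0)] + [(5, 10.0)] * 9

# A large institution: the same review among 1000 papers.
large = [(1278, 10.0)] + [(5, 10.0)] * 999

print(normalised_impact(small))   # ~13.2: the outlier dominates
print(normalised_impact(large))   # ~0.63: the outlier is diluted
```

The point is that the same outlier paper is worth twenty times more to the small institution, because its enormous normalised ratio is averaged over far fewer publications.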
This is not as bad as Alexandria University being declared the fourth best university for research impact in 2010. MEPhI is a genuinely excellent institution which Alexandria, despite a solitary Nobel laureate and an historic library, was not. But does it really deserve to be number one for research impact or even in the top 100? TR's methods are in need of very thorough revision.
And I haven't heard about any celebrations in Houston either.