One sign of the reliability of the Shanghai Academic Ranking of World Universities is that there is little change at the top from year to year. It is difficult to get excited about Tokyo slipping one place to join University College London in 21st position, although I must admit that the Federal Institute of Technology in Zurich moving up two whole places is rather intriguing.
These rankings are best used to study changes over several years. Since 2004, according to data provided by the Shanghai rankers, the following countries have increased their membership of the world's elite, the top 100 universities:
Australia +3
Israel +2
USA +1
Switzerland +1
Netherlands +1
Denmark +1
Belgium +1
These countries have seen universities leave the top 100:
Germany -3
Japan -2
UK -2
Sweden -1
Italy -1
Austria -1
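Since the top 100 is a fixed-size list, every entry displaces an exit, so the gains and losses above must cancel out exactly. A minimal Python sketch, using the figures from the two lists, confirms this:

```python
# Net changes in top-100 membership, 2004-2013, from the lists above.
gains = {"Australia": 3, "Israel": 2, "USA": 1, "Switzerland": 1,
         "Netherlands": 1, "Denmark": 1, "Belgium": 1}
losses = {"Germany": -3, "Japan": -2, "UK": -2,
          "Sweden": -1, "Italy": -1, "Austria": -1}

# A fixed-size top 100 means entries and exits must balance,
# so the net movement across all countries should be zero.
net = sum(gains.values()) + sum(losses.values())
print(net)  # 0
```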
At the very top there is no sign of the erosion of English-speaking dominance (academically, I think Israel can be classed as English-speaking). If anything, it is being extended, although with a shift from the UK to the US and Australia.
Looking at the top 500, which we might consider to include the world-class research universities, the picture is different. From 2004 to 2013 the following changes occurred:
China +26
Australia +5
Saudi Arabia +4
South Korea +3
Portugal +3
Brazil +2
New Zealand +2
Spain +1
Sweden +1
Turkey +1
Malaysia +1
Slovenia +1
Iran +1
Egypt +1
Croatia +1
Chile +1
Serbia +1
Mexico +1
USA -21
Japan -16
Germany -5
UK -5
Italy -4
France -2
India -2
Switzerland -1
Netherlands -1
Denmark -1
Hungary -1
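Merging the gains and losses above into a single table makes the biggest movers easy to read off. A minimal sketch, with the country names and figures taken from the lists:

```python
# Net changes in top-500 membership, 2004-2013, from the lists above.
changes = {
    "China": 26, "Australia": 5, "Saudi Arabia": 4, "South Korea": 3,
    "Portugal": 3, "Brazil": 2, "New Zealand": 2, "Spain": 1,
    "Sweden": 1, "Turkey": 1, "Malaysia": 1, "Slovenia": 1, "Iran": 1,
    "Egypt": 1, "Croatia": 1, "Chile": 1, "Serbia": 1, "Mexico": 1,
    "USA": -21, "Japan": -16, "Germany": -5, "UK": -5, "Italy": -4,
    "France": -2, "India": -2, "Switzerland": -1, "Netherlands": -1,
    "Denmark": -1, "Hungary": -1,
}

# Sort by net change: biggest gainer first, biggest loser last.
movers = sorted(changes.items(), key=lambda kv: kv[1], reverse=True)
print(movers[0])   # ('China', 26)
print(movers[-1])  # ('USA', -21)
```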
Here the big story is the relative decline of the US, Northern Europe, Japan and India, and the rise of China and, to a lesser extent, of Australia, Korea, Southwest Asia, Southern Europe (except Italy) and Latin America.
There is very little sign of any Asian renaissance outside Greater China and Korea, and perhaps the Middle East. India has actually lost ground over the last decade, and there is now only one institution in the top 500 from the whole of South and Central Asia.
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Saturday, August 17, 2013
Wednesday, August 14, 2013
The Webometrics Methodology
Isidro F. Aguillo of Webometrics has kindly sent me a summary of the methodology:
- The ranking intends to measure the global performance of universities using the web only as a proxy. Web design is mostly irrelevant; web content is key if the web policy intends to mirror all the university's missions on the web.
- We have a MODEL for weighting the variables in the composite indicator: the traditional "impact factor" developed several decades ago in bibliometrics, adapted to the web, with a 1:1 (50%:50%) ratio between ACTIVITY and IMPACT.
- For measuring IMPACT (visibility? impact? quality?) there are three alternatives: prestige surveys (THE, QS), peer citations (Leiden, NTU, URAP) or link visibility (the number of external inlinks or backlinks). We use the last option because it acknowledges a larger diversity of activities and missions, and (very important) recognition by a huge number of users, a truly global audience.
- Personally, I have two important objectives with the ranking. First, I am a scholar (scholar.google.com/citations?user=SaCSbeoAAAAJ) paid by the Spanish government to do scientific research, so the ranking provides me with a lot of valuable data for analysis and papers. Second, I have a "political" agenda: supporting Open Access initiatives.
- So, for measuring ACTIVITY the key issue is full-text documents: Openness consists of the number of files in pdf, doc, ppt and similar formats.
- An important innovation in the three latest editions is the Excellence indicator, which is not really web-related but intends to acknowledge research-intensive institutions. The data is provided by Scimago and reflects the top 10% most cited papers in 21 disciplines.
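The 1:1 Activity:Impact weighting described in the summary can be expressed as a simple composite. A minimal sketch on normalized 0-1 indicator values; note that the equal split inside Activity across Presence, Openness and Excellence is my assumption, since the summary specifies only the 50:50 Activity:Impact ratio:

```python
def webometrics_style_score(impact, presence, openness, excellence):
    """Composite score on normalized 0-1 indicator values.

    Impact (link visibility) carries 50%. The remaining 50% (Activity)
    is split equally here across Presence, Openness and Excellence:
    an assumed split, as the summary only gives the 1:1 ratio.
    """
    activity = (presence + openness + excellence) / 3
    return 0.5 * impact + 0.5 * activity

# Example: a university strong on web visibility but weaker on
# highly cited research output.
score = webometrics_style_score(impact=0.9, presence=0.8,
                                openness=0.7, excellence=0.3)
print(round(score, 2))  # 0.75
```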
The Webometrics Rankings
The July 2013 Webometrics rankings have just been published. The top five are:
1. Harvard
2. MIT
3. Stanford
4. UC Berkeley
5. UCLA
In first place in various regions are:
Latin America: Sao Paulo
Europe: Oxford
Asia: National University of Singapore
Africa: KwaZulu-Natal
Arab World: King Saud University
Oceania: Australian National University
Caribbean: University of Puerto Rico Mayaguez
Middle East: Tel Aviv
South Asia: IIT Bombay
Eastern and Central Europe: Lomonosov Moscow State University
Saturday, August 03, 2013
The Forbes Rankings
Forbes has just released its Best College list, which is compiled by the Center for College Affordability and Productivity. The ranking is geared to student concerns: its indicators include quality of teaching, student debt and graduate employability.
Stanford is at the top, Harvard is eighth and Caltech 18th. The armed forces academies and small liberal arts colleges do well.
The top five are:
1. Stanford
2. Pomona College
3. Princeton
4. Yale
5. Columbia