Magnus Gunnarsson of the University of Gothenburg has reminded me of a 2010 report which included an assessment of methodology based on IREG's Berlin principles.
There were 16 Berlin principles, 14 of which were given weighted subscores in the report. The values and weighting were determined subjectively, although the assessors were evidently well informed.
The Berlin principles were grouped into four categories: Purpose and Goals of Rankings; Design and Weighting of Indicators; Collection and Processing of Data; and Presentation of Ranking Results. For more information see here.
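Each ranking's overall score was presumably some aggregation of its per-principle subscores. As a minimal sketch of how such a weighted aggregation might work (the principle names, weights, and subscores below are hypothetical illustrations, not values from the report):

```python
# Minimal sketch of a weighted mean of per-principle subscores.
# All principle names, weights, and scores here are hypothetical,
# not taken from the 2010 report.

def overall_score(subscores, weights):
    """Weighted mean: sum(w_i * s_i) / sum(w_i)."""
    total_weight = sum(weights[p] for p in subscores)
    return sum(weights[p] * s for p, s in subscores.items()) / total_weight

# Hypothetical example using three principles.
weights = {"transparency": 2.0, "verifiable data": 2.0, "clear purpose": 1.0}
subscores = {"transparency": 3.0, "verifiable data": 2.5, "clear purpose": 4.0}

print(round(overall_score(subscores, weights), 2))  # -> 3.0
```

On a scheme like this the overall score stays on the same scale as the subscores, which would be consistent with the 1.70 to 3.10 range in the table below.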
The ranking of rankings by methodology is as follows. It is obviously out of date.
| Position | Ranking | Overall method score |
|---|---|---|
| 1= | CHE (Germany) | 3.10 |
| 1= | Webometrics | 3.10 |
| 3 | HEEACT (now National Taiwan University) | 2.90 |
| 4 | ARWU (Shanghai) | 2.80 |
| 5= | High Impact Universities (Australia) | 2.60 |
| 5= | Observatory (sustainable development) | 2.60 |
| 7= | Scimago | 2.40 |
| 7= | CWTS Leiden Ranking | 2.40 |
| 9 | THE | 2.30 |
| 10 | Mines ParisTech | 2.20 |
| 11 | QS | 2.10 |
| 12 | Rater (Russia) | 1.70 |