Phil Baty, editor of the Times Higher Education World University Rankings, has indicated that the publication of a paper from the ATLAS and CMS experiments at the CERN Large Hadron Collider project is a challenge for rankers.
The paper in question has a total of 5,154 authors, if that is the right word, with sole or primary affiliation to 344 institutions. Of those authors 104 have a secondary affiliation. One is deceased. Under THE's current methodology every institution contributing to the paper will get credit for all the citations that the paper will receive, which is very likely to run into the thousands.
For the elite universities participating in these projects a few thousand citations will make little or no difference. But for a small specialised institution, or a large one that does little research, those citations spread out over a few hundred papers could make a big difference.
In last year's rankings, places like Florida Institute of Technology, Université Cadi Ayyad Marrakech in Morocco, Federico Santa Maria Technical University in Chile and Bogazici University in Turkey got implausibly high scores for citations that were well ahead of their scores for the other criteria.
The paper in question does set a record for the number of contributors although the challenge is not particularly new.
At a seminar in Moscow earlier this year, Baty suggested that THE, now independent of Thomson Reuters, was considering using fractionated counting, dividing all the citations among the contributing institutions.
This would be an excellent idea and should be technically quite feasible since CWTS at Leiden University use it as their default option.
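To illustrate the difference (a minimal sketch with made-up numbers, not CWTS's actual implementation), fractionated counting simply divides each paper's citations equally among the contributing institutions instead of giving every contributor full credit:

```python
from collections import defaultdict

def count_citations(papers, fractional=False):
    """Tally citation credit per institution.

    papers: list of (citations, [institutions]) tuples.
    With fractional=False every institution gets full credit
    for the paper (the current THE approach described above);
    with fractional=True each gets an equal share.
    """
    credit = defaultdict(float)
    for citations, institutions in papers:
        share = citations / len(institutions) if fractional else citations
        for inst in institutions:
            credit[inst] += share
    return dict(credit)

# Hypothetical data: one mega-collaboration paper with 100
# contributing institutions, and one sole-institution paper.
papers = [
    (5000, ["Collider Partner A", "Collider Partner B"] + [f"inst{i}" for i in range(98)]),
    (50, ["Small College"]),
]

full = count_citations(papers)
frac = count_citations(papers, fractional=True)
print(full["Collider Partner A"], frac["Collider Partner A"])  # prints: 5000.0 50.0
```

Under full counting a marginal contributor banks all 5,000 citations; under fractional counting its share is no larger than the small college's sole-authored paper.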
But there would be a price to pay. The current methodology allows THE to boast that it has found a way of uncovering hitherto unnoticed pockets of excellence. It is also a selling point in THE's imperial designs of expanding into regions where there has so far been little interest in ranking: Russia, the Middle East, Africa, the BRICS. A few universities in those regions could make a splash in the rankings if they recruited, even as an adjunct, a researcher working on the LHC project.
It would be most welcome if THE does start using fractionated counting in its citations indicator. Also welcome would be some other changes: not counting self-citations, reducing the weighting for the indicator, including several different methods of evaluating research impact or quality, and, especially important, getting rid of the "regional modification" that awards a bonus for being located in a low scoring country.
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Friday, May 29, 2015
Friday, May 22, 2015
An Experiment Using LinkedIn Data to Rank Arab Universities
University World News recently published an article by Rahul Choudaha suggesting that LinkedIn is the future of global rankings. At the moment that sounds a bit exaggerated and LinkedIn in its present form may be gone in a decade but he could be on to something.
Leaving Europe, North America and East Asia aside, the reliability of institutional data is very low and that makes serious evaluation of graduate outcomes, staff quality, income, teaching resources and so on extremely difficult.
This problem is especially acute for the Middle East and North Africa region, where there appears to be a big demand for university rankings but little accurate information. The consequence has been some highly implausible results in the rankings attempted so far. Last year THE produced a "snapshot" of a ranking indicator which put Texas A&M Qatar as the top university for research impact, and QS's pilot rankings have the American University of Sharjah in joint first place for academic reputation, Al-Nahrain University top for faculty student ratio and Khalifa University top for papers per faculty.
So, here is a list of Arab universities ordered by the number of students or professionals putting them on the Decision Board, indicating an interest in attending. Counting was done on the 14th of May.
If this approximates to reputation among students and the public then it seems that Egyptian universities have been undervalued in previous ranking exercises.
Rank | University | Country | Interested in attending |
---|---|---|---|
1 | Helwan University | Egypt | 422 |
2 | American University in Cairo | Egypt | 394 |
3 | Arab Academy of Science, Technology and Maritime Transport | Egypt | 359 |
4 | Cairo University | Egypt | 353 |
5 | Ain Shams University | Egypt | 245 |
6 | Alexandria University | Egypt | 230 |
7 | King Fahd University of Petroleum and Minerals | Saudi Arabia | 211 |
8 | American University of Beirut | Lebanon | 193 |
9 | École Nationale Polytechnique d'Alger | Algeria | 184 |
10 | King Saud University | Saudi Arabia | 138 |
11 | Lebanese American University | Lebanon | 133 |
12 | American University in Dubai | UAE | 131 |
13 | Qatar University | Qatar | 102 |
14 | American University of Sharjah | UAE | 91 |
15 | King Abdullah University of Science and Technology | Saudi Arabia | 85 |
16= | Al Azhar University | Egypt | 78 |
16= | University of Dubai | UAE | 78 |
18 | Damascus University | Syria | 73 |
19 | University of Dammam | Saudi Arabia | 70 |
20= | Mansoura University | Egypt | 68 |
20= | Houari Boumediene University of Science and Technology | Algeria | 68 |
22 | UAE University | UAE | 62 |
23 | Higher Colleges of Technology | UAE | 58 |
24= | Tanta University | Egypt | 51 |
24= | German University in Cairo | Egypt | 51 |
26 | Zagazig University | Egypt | 50 |
27= | Suez Canal University | Egypt | 43 |
27= | King Abdulaziz University | Saudi Arabia | 43 |
27= | Umm Al-Qura University | Saudi Arabia | 43 |
30= | Abu Dhabi University | UAE | 33 |
30= | Ajman University of Science & Technology | UAE | 33 |
32 | Assiut University | Egypt | 32 |
33 | Université Mentouri de Constantine | Algeria | 27 |
34 | Université Libanaise | Lebanon | 26 |
35 | Al-Imam Mohamed Ibn Saud Islamic University | Saudi Arabia | 23 |
36 | Université Saad Dahlab Blida | Algeria | 22 |
37 | Prince Sultan University | Saudi Arabia | 21 |
38= | King Faisal University | Saudi Arabia | 20 |
38= | Université Mouloud Mammeri de Tizi Ouzou | Algeria | 20 |
40= | Université Badji Mokhtar de Annaba | Algeria | 19 |
40= | Khalifa University | UAE | 19 |
42= | Université de Batna | Algeria | 18 |
42= | Université Cadi Ayyad Marrakech | Morocco | 18 |
44= | King Khalid University | Saudi Arabia | 17 |
44= | Sanaa University | Yemen | 17 |
46 | University of Bejaia | Algeria | 16 |
47= | Zayed University | UAE | 14 |
47= | Université Sidi Mohammed Ben Abdellah | Morocco | 14 |
49= | Masdar Institute of Science and Technology | UAE | 13 |
49= | Université d'Oran | Algeria | 13 |
51= | Yarmouk University | Jordan | 12 |
51= | Universite de Tunis El Manar | Tunisia | 12 |
53= | Texas A&M Qatar | Qatar | 11 |
53= | University of Sharjah | UAE | 11 |
53= | Minia University | Egypt | 11 |
53= | University of Tunis | Tunisia | 11 |
53= | Universite de Monastir | Tunisia | 11 |
58= | University of Jordan | Jordan | 10 |
58= | Benha University | Egypt | 10 |
58= | University of Bahrain | Bahrain | 10 |
61 | Taif University | Saudi Arabia | 0 |
62 | Kuwait University | Kuwait | 0 |
63 | University of Baghdad | Iraq | 0 |
64 | University of Khartoum | Sudan | 0 |
65 | Jordan University of Science and Technology | Jordan | 0 |
66 | Mosul University | Iraq | 0 |
67 | Qassim University | Saudi Arabia | 0 |
68 | Taibah University | Saudi Arabia | 0 |
69 | Hashemite University | Jordan | 0 |
70 | Université Abou Bekr Belkaid Tlemcen | Algeria | 0 |
71 | Al Balqa Applied University | Jordan | 0 |
72 | Babylon University | Iraq | 0 |
73 | South Valley University | Egypt | 0 |
74 | Menoufia University | Egypt | 0 |
75 | Fayoum University | Egypt | 0 |
76 | Sohag University | Egypt | 0 |
77 | Beni-Suef University | Egypt | 0 |
78 | Jazan University | Saudi Arabia | 0 |
79 | Universite de Sfax | Tunisia | 0 |
80 | Al Nahrain University | Iraq | 0 |
81 | University of Basrah | Iraq | 0 |
82 | King Saud bin Abdulaziz University for Health Sciences | Saudi Arabia | 0 |
83 | Université Mohammed V Agdal | Morocco | 0 |
84 | Alfaisal University | Saudi Arabia | 0 |
85 | Arabian Gulf University | Bahrain | 0 |
86= | Petroleum Institute Abu Dhabi | UAE | 0 |
86= | National Engineering School of Sfax | Tunisia | 0 |
88 | Mutah University | Jordan | 0 |
89 | Kafrelsheikh University | Egypt | 0 |
90 | Université de Carthage (7 de Novembre) | Tunisia | 0 |
91 | University of Balamand | Lebanon | 0 |
92 | Beirut Arab University | Lebanon | 0 |
93 | Université Hassan II Mohammadia | Morocco | 0 |
94 | Universite de Sousse | Tunisia | 0 |
95 | Université Abdelmalek Essaadi | Morocco | 0 |
96 | Petra University | Jordan | 0 |
97 | Djillali Liabes University | Algeria | 0 |
98 | Université Ferhat Abbas Setif | Algeria | 0 |
99 | Princess Sumaya University for Technology | Jordan | 0 |
100 | Université de la Manouba | Tunisia | 0 |
101 | Université Ibn Tofail Kénitra | Morocco | 0 |
102 | Université Saint Joseph de Beyrouth | Lebanon | 0 |
103 | Université de Gabes | Tunisia | 0 |
104 | Université Mohammed Premier Oujda | Morocco | 0 |
105 | Mohamed Boudiaf University of Science and Technology | Algeria | 0 |
106 | Sultan Qaboos University | Oman | 0 |
Thursday, May 14, 2015
How to improve your total Contribution in the academic caldener.
I have received several invitations over the last few months to let a team of consultants write up my research and get me into an ISI or Scopus journal. The most recent was from something called Prime Journal Consultants. It is hard to believe that anyone could be so naive as to pay money to someone who writes so badly but who knows? Maybe Chris Olsen has got a doctorate now.
Or maybe standards at Scopus and Thomson Reuters journals are not what they used to be.
Anyway, here is the first part of the message.
"The Most valuable part of your research is the data and study that you have already conducted, its time now to use the study and with our expert assistance create a complete research paper out of it and get it published to the highest impact factor ISI or Scopus Indexed journals to earn Recognition and Promotion.
The Contribution of Research Article Publishing Towards your Promotion
Publication is both a measure of a scholar’s knowledge and also a benchmark for academic success. The minimum percentage for promotion in terms of Research Publication is at least 35-40% of your total Contribution in the academic caldener.
Common Misconception About ISI publishing -Book A Dedicated Consultant Today
ISI Publishing is a time consuming process, The Genuine ISI journals would take time after getting you through rigorous revisions and edits. That is where our Dedicated Consultants Come in to assist you take you through theentire steps to get you an ISI acceptance."
Wednesday, May 13, 2015
The March of Pseudoscience Stumbles a Bit
Pseudoscience continues to thrive in the West. Although -- I think -- no longer offered by universities, homeopathy is still viewed with favour by many in the British establishment, including the Prince of Wales, and has received official recognition in Canada.
Meanwhile in Malaysia, Universiti Malaysia Pahang (UMP) has produced an anti-hysteria kit consisting of things like chopsticks, lime, salt, vinegar and pepper spray, which will supposedly repel evil spirits. The kit sells for 8,750 ringgit, which includes training and technical support.
The Malaysian religious authorities have been more sceptical than the British royal family and have treated the kits with derision. UMP has replied by claiming the kit was based on scientific research, although it has not said where the research was published.
Monday, May 11, 2015
The Geography of Excellence: the Importance of Weighting
So finally, the 2015 QS subject rankings were published. It seems that the first attempt was postponed when the original methodology produced implausible fluctuations, probably resulting from the volatility that is inevitable when there are a small number of data points -- citations and survey responses -- outside the top 50 for certain subjects.
QS have done some tweaking, some of it aimed at smoothing out the fluctuations in the responses to their academic and employer surveys.
These rankings look a bit different from the World University Rankings. Cambridge has the most top ten placings (31), followed by Oxford and Stanford (29 each), Harvard (28), Berkeley (26) and MIT (16).
But in the world rankings MIT is in first place, Cambridge second, Imperial College London third, Harvard fourth and Oxford and University College London joint fifth.
The subject rankings use two indicators from the world rankings, the academic survey and the employer survey, but not internationalisation, student faculty ratio or citations per faculty. They add two indicators: citations per paper and the h-index.
The result is that the London colleges do less well in the subject rankings since they do not benefit from their large numbers of international students and faculty. Caltech, Princeton and Yale also do relatively badly, probably because the new rankings do not take account of their favourable faculty student ratios.
The lesson of this is that if weighting is not everything, it is definitely very important.
Below is a list of universities ordered by the number of top five placings. There are signs of the Asian advance -- Peking, Hong Kong and the National University of Singapore -- but it is an East Asian advance.
Europe is there too but it is Cold Europe -- Switzerland, Netherlands and Sweden -- not the Mediterranean.
Rank | University | Country | Number of Top Five Places |
---|---|---|---|
1 | Harvard | USA | 26 |
2 | Cambridge | UK | 20 |
3 | Oxford | UK | 18 |
4 | Stanford | USA | 17 |
5= | MIT | USA | 16 |
5= | UC Berkeley | USA | 16 |
7 | London School of Economics | UK | 7 |
8= | University College London | UK | 3 |
8= | ETH Zurich | Switzerland | 3 |
10= | New York University | USA | 2 |
10= | Yale | USA | 2 |
10= | Delft University of Technology | Netherlands | 2 |
10= | National University of Singapore | Singapore | 2 |
10= | UC Los Angeles | USA | 2 |
10= | UC Davis | USA | 2 |
10= | Cornell | USA | 2 |
10= | Wisconsin - Madison | USA | 2 |
10= | Michigan | USA | 2 |
10= | Imperial College London | UK | 2 |
20= | Wageningen | Netherlands | 1 |
20= | University of Southern California | USA | 1 |
20= | Pratt Institute, New York | USA | 1 |
20= | Rhode Island School of Design | USA | 1 |
20= | Parsons: the New School for Design | USA | 1 |
20= | Royal College of Arts London | UK | 1 |
20= | Melbourne | Australia | 1 |
20= | Texas-Austin | USA | 1 |
20= | Sciences Po | France | 1 |
20= | Princeton | USA | 1 |
20= | Yale | USA | 1 |
20= | Chicago | USA | 1 |
20= | Manchester | UK | 1 |
20= | University of Pennsylvania | USA | 1 |
20= | Durham | UK | 1 |
20= | INSEAD | France | 1 |
20= | London Business School | UK | 1 |
20= | Northwestern | USA | 1 |
20= | Utrecht | Netherlands | 1 |
20= | Guelph | Canada | 1 |
20= | Royal Veterinary College London | UK | 1 |
20= | UC San Francisco | USA | 1 |
20= | Johns Hopkins | USA | 1 |
20= | KU Leuven | Belgium | 1 |
20= | Gothenburg | Sweden | 1 |
20= | Hong Kong | Hong Kong | 1 |
20= | Karolinska Institute | Sweden | 1 |
20= | Sussex | UK | 1 |
20= | Carnegie Mellon University | USA | 1 |
20= | Rutgers | USA | 1 |
20= | Pittsburgh | USA | 1 |
20= | Peking | China | 1 |
20= | Purdue | USA | 1 |
20= | Georgia Institute of Technology | USA | 1 |
20= | Edinburgh | UK | 1 |
Saturday, May 09, 2015
Are all subjects the same?
University rankers seem to be moving towards the field normalization of citations data. In 2010 Times Higher Education and Thomson Reuters started using it for their world rankings. The scores for citations did not reflect the absolute number of citations or even citations per paper or per faculty but citations per paper in relation to the world average for 250 fields. Normalisation by year of citation was added to the process. I have heard that QS is considering normalization by five subject groups. Meanwhile THE has switched to Scopus as a data source and they apparently have 300 fields.
This is justified by the claim that it is unfair that an outstanding paper in history or philosophy should be given the same value as a mediocre one in medicine or physics, something that could happen if only the number of citations were counted. Perhaps, but that assumes that all subjects are equal even if society values them differently and provides more money for some fields and even if they require different levels of cognitive ability.
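The mechanics can be sketched in a few lines (the world averages here are hypothetical, not THE's or Scopus's actual figures): a university's score is the mean ratio of each paper's citations to the world average for its field.

```python
def field_normalised_score(papers, world_averages):
    """Mean ratio of each paper's citations to the world
    average for its field (normalisation by year works the
    same way, with averages broken down by year as well).

    papers: list of (citations, field) tuples.
    world_averages: expected citations per paper, by field.
    """
    ratios = [citations / world_averages[field] for citations, field in papers]
    return sum(ratios) / len(ratios)

# Hypothetical averages: medicine papers are cited far more often.
world_averages = {"medicine": 20.0, "philosophy": 2.0}

# Two papers with identical raw citation counts get very different credit.
papers = [(10, "medicine"), (10, "philosophy")]
print(field_normalised_score(papers, world_averages))  # (0.5 + 5.0) / 2 = 2.75
```

The philosophy paper counts ten times as much as the medicine paper with the same raw citation count, which is exactly the levelling between fields that the paragraph above questions.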
The website The Tab provides evidence from the Complete University Guide (I am still searching for the original data) that in the UK there are substantial differences in the grades required by universities for various subjects.
The five most difficult subjects measured by points for grades (Advanced level A = 120) are:
Medicine
Dentistry
Physics
Chemical Engineering
Classics.
The least difficult are:
Business and Management
Accounting and Finance
Education
American Studies
Sociology.
This is for undergraduate education in the UK. Looking at the future majors of GRE test takers in the US, we find something similar. Future philosophers, physicists and economists are very much brighter than future accountants, social workers, education specialists and public administrators. Engineers perform poorly for verbal aptitude but better for mathematical aptitude. See here and here.
Does it make sense that the average paper in a demanding discipline like physics or philosophy should be treated as exactly the same as the average paper in education or sociology?
Wednesday, April 22, 2015
They'll be doing Gourmet Nights next
I have just received an invitation to the Dublin Young Universities Summit on the 29th of this month. It is remarkably restrained: 'prestigious' appears only three times, 'exclusive' four times and 'distinguished' once.
"...EXCLUSIVE a prestigious networking dinner and world class speakers ... A Rankings masterclass with Phil Baty ..."
"We would be honoured if you would register to attend this prestigious event..."
"Distinguished speakers include..."
"In addition, the event will feature:
- Prestigious networking dinner
- An exclusive rankings masterclass with Mr Phil Baty"
"Your ticket will include*:
- Prestigious welcome dinner
- An exclusive rankings masterclass with Mr Phil Baty, editor at large and rankings editor, THE
- World-class speakers
- 2 x Networking lunches
- Scheduled transfers between the official hotel and the university venues
- Closing networking drinks reception"
Tuesday, April 21, 2015
MENA universities and Google Scholar Results
For reference purposes only, here are the Google Scholar Results (all documents) taken on April 4th for all universities included in any of the Arab Region/MENA rankings produced by US News, QS or Times Higher Education. Time period was 2010-2014.
Rank | University | Country | Google Scholar Results |
---|---|---|---|
1 | Cairo University | Egypt | 19,100 |
2 | King Saud University | Saudi Arabia | 17,200 |
3 | King Abdulaziz University | Saudi Arabia | 15,800 |
4 | Ain Shams University | Egypt | 15,500 |
5 | American University of Beirut | Lebanon | 12,700 |
6 | King Abdullah University of Science and Technology | Saudi Arabia | 11,000 |
7 | Alexandria University | Egypt | 10,500 |
8 | University of Jordan | Jordan | 9,700 |
9 | Al Azhar University | Egypt | 9,480 |
10 | King Fahd University of Petroleum and Minerals | Saudi Arabia | 9,180 |
11 | American University in Cairo | Egypt | 9,080 |
12 | Mansoura University | Egypt | 8,790 |
13 | Zagazig University | Egypt | 8,410 |
14 | Assiut University | Egypt | 7,600 |
15 | Sultan Qaboos University | Oman | 7,410 |
16 | Kuwait University | Kuwait | 7,040 |
17 | University of Baghdad | Iraq | 6,960 |
18 | University of Khartoum | Sudan | 5,910 |
19 | Qatar University | Qatar | 5,430 |
20 | Suez Canal University | Egypt | 5,340 |
21 | Tanta University | Egypt | 5,030 |
22 | University of Sharjah | UAE | 5,020 |
23 | United Arab Emirates University | UAE | 4,690 |
24 | Helwan University | Egypt | 4,200 |
25 | King Khalid University | Saudi Arabia | 4,110 |
26 | Yarmouk University | Jordan | 4,100 |
27 | Jordan University of Science and Technology | Jordan | 4,040 |
28 | Taif University | Saudi Arabia | 3,530 |
29 | Minia University | Egypt | 3,500 |
30 | Benha University | Egypt | 3,400 |
31 | American University of Sharjah | UAE | 3,150 |
32 | University of Tunis | Tunisia | 3,130 |
33 | King Faisal University | Saudi Arabia | 3,090 |
34 | University of Mosul | Iraq | 2,820 |
35= | Qassim University | Saudi Arabia | 2,770 |
35= | Taibah University | Saudi Arabia | 2,770 |
37 | Hashemite University | Jordan | 2,650 |
38 | Umm Al-Qura University | Saudi Arabia | 2,610 |
39 | Université Abou Bekr Belkaid Tlemcen | Algeria | 2,370 |
40 | Al Balqa' Applied University | Jordan | 2,330 |
41 | University of Babylon | Iraq | 2,250 |
42 | South Valley University | Egypt | 2,240 |
43 | Lebanese American University | Lebanon | 2,170 |
44 | Menoufia University | Egypt | 2,120 |
45 | Damascus University | Syria | 2,060 |
46 | Fayoum University | Egypt | 1,970 |
47 | University of Bahrain | Bahrain | 1,940 |
48= | Khalifa University | UAE | 1,890 |
48= | Sohag University | Egypt | 1,890 |
50 | Beni-Suef University | Egypt | 1,770 |
51 | Jazan University | Saudi Arabia | 1,750 |
52 | Universite de Sfax | Tunisia | 1,740 |
53 | Al Nahrain University | Iraq | 1,710 |
54 | Zayed University | UAE | 1,690 |
55 | University of Basrah | Iraq | 1,560 |
56 | King Saud bin Abdulaziz University for Health Sciences | Saudi Arabia | 1,520 |
57 | Masdar Institute of Science and Technology | UAE | 1,470 |
58 | Université Mohammed V Agdal | Morocco | 1,450 |
59 | Université d'Oran | Algeria | 1,370 |
60 | University of Dammam | Saudi Arabia | 1,300 |
61 | Universite de Tunis El Manar | Tunisia | 1,270 |
62 | German University in Cairo | Egypt | 1,210 |
63 | Alfaisal University | Saudi Arabia | 1,200 |
64 | Universite des Sciences et de la Technologie Houari Boumediene | Algeria | 1,160 |
65 | Sanaa University | Yemen | 1,140 |
66 | Université Libanaise | Lebanon | 1,100 |
67 | Arabian Gulf University | Bahrain | 1,030 |
68 | Petroleum Institute Abu Dhabi | UAE | 948 |
69 | National Engineering School of Sfax | Tunisia | 872 |
70 | Mutah University | Jordan | 826 |
71 | Kafrelsheikh University | Egypt | 825 |
72 | Université de Batna | Algeria | 818 |
73 | Universite de Monastir | Tunisia | 757 |
74 | Université de Carthage (7 de Novembre) | Tunisia | 754 |
75 | Higher Colleges of Technology | UAE | 726 |
76 | University of Balamand | Lebanon | 673 |
77 | Beirut Arab University | Lebanon | 662 |
78 | Abu Dhabi University | UAE | 627 |
79 | Université Cadi Ayyad Marrakech | Morocco | 588 |
80 | Arab Academy of Science Technology and Maritime Transport | Egypt | 585 |
81 | Université Hassan II Mohammadia | Morocco | 559 |
82 | Universite de Sousse | Tunisia | 531 |
83 | Université Mentouri de Constantine | Algeria | 527 |
84 | Université Abdelmalek Essaadi | Morocco | 509 |
85 | University of Bejaia | Algeria | 508 |
86= | Petra University | Jordan | 506 |
86= | Université Sidi Mohammed Ben Abdellah | Morocco | 506 |
88 | Prince Sultan University | Saudi Arabia | 469 |
89 | Djillali Liabes University | Algeria | 464 |
90 | Université Ferhat Abbas Setif | Algeria | 462 |
91 | University of Dubai | UAE | 437 |
92 | Université Mouloud Mammeri de Tizi Ouzou | Algeria | 427 |
93 | Princess Sumaya University for Technology | Jordan | 417 |
94 | Université de la Manouba | Tunisia | 412 |
95 | Université Ibn Tofail Kénitra | Morocco | 403 |
96 | American University in Dubai | UAE | 364 |
97 | Al-Imam Mohamed Ibn Saud Islamic University | Saudi Arabia | 257 |
98 | Université Saint Joseph de Beyrouth | Lebanon | 246 |
99 | Université de Gabes | Tunisia | 208 |
100 | Université Mohammed Premier Oujda | Morocco | 193 |
101 | École Nationale Polytechnique d'Alger | Algeria | 184 |
102 | Ajman University of Science & Technology | UAE | 142 |
103 | Texas A&M Qatar | Qatar | 111 |
104 | UST d'Oran Mohamed Boudiaf | Algeria | 107 |
105 | Université Saad Dahlab Blida | Algeria | 70 |
106 | Université Badji Mokhtar de Annaba | Algeria | 54 |
QS Subject Rankings out on 29th
QS have redesigned their subject ranking methodology and will release the revised results on April 29th. The modifications are:
- reintroducing regional weightings in the academic and employer surveys
- counting articles in all the subjects to which they are assigned
- adjusting weightings, usually to increase those for citations and the h-index and to reduce those for the surveys
- increasing the minimum number of papers required for inclusion in the rankings
- extending survey samples to 5 years.
Monday, April 20, 2015
Secrecy and Mystery: New Ranking Indicators
The South China Morning Post provides an explanation why some Chinese universities may not be doing well in the global rankings.
If you want to know the five most secret and mysterious universities in China, go here.
"If you are a student of science or engineering looking for a “thrilling” exchange programme in China, this may be the right place for you. The universities on the list below enjoy high reputations in China and they all accept overseas students, but they rarely publish papers in international journals due to the sensitivity of their research, so you may not easily find their names on most world university rankings. If you visit them, don’t be misled by the peaceful and friendly atmosphere on campus. The old professor riding a dusty bike with fatherly smile and silvery hair along your way to class could be chairing the development of China’s most deadly space weapons."
Friday, April 17, 2015
Ranking MENA Universities
Interest in the ranking of Arab and/or Middle Eastern and North African universities seems to be growing. There is definitely a feeling in many countries that higher education needs drastic reform and should open up to objective evaluation. Of course, there is also a realisation that rankings can be helpful for marketing and career advancement.
But there are problems. Most Arab universities do not have sufficient staff and organisation to provide adequate data about faculty and student numbers, let alone about things like employment of graduates or sources of income. Surveys are a possible source of information about teaching quality but it is unlikely that at present they could produce reliable and accurate information.
The collection of data about research has probably reached the point where reasonable research rankings can be created for the Arab world and also for regions like South Asia, Southeast Asia and Africa. Unfortunately, the majority of Arab or Middle Eastern universities produce very little significant research so such research based rankings would probably be of relevance to about fifty institutions.
See Higher Education Strategy Associates for more discussion.
The US News has produced a variation on its new Best Global Universities rankings. This is based on research and citations data from the Scopus database. As a ranking of universities according to research output and impact, it looks quite plausible.
The top five universities are:
1. King Saud University, Saudi Arabia
2. King Abdulaziz University, Saudi Arabia
3. King Abdullah University of Science and Technology, Saudi Arabia
4. Cairo University, Egypt
5. American University of Beirut, Lebanon.
QS has produced rankings that are "a pilot version of the ranking that has been developed to reflect priorities and challenges for universities in the region."
The top five are:
1. King Fahd University of Petroleum and Minerals, Saudi Arabia
2. American University of Beirut, Lebanon
3. King Saud University, Saudi Arabia
4. American University in Cairo, Egypt
5. King Abdulaziz University, Saudi Arabia.
This is a bit different from the US News rankings but the QS table does include indicators like faculty student ratio, web impact and international faculty and students.
Times Higher Education has produced what it calls a "snapshot", which is in fact only one indicator, citations per paper normalised by field and year. The top five are:
1. Texas A&M University Qatar
2. Lebanese American University
3. King Abdulaziz University, Saudi Arabia
4. Qatar University
5. American University of Beirut, Lebanon.
Texas A&M Qatar is a single subject branch campus which does not have a doctoral programme. It is difficult to believe that it will actually be in the full rankings that are scheduled for next year. THE's top thirty does not include Cairo University or King Abdullah University of Science and Technology.
As a simple check on the validity of these three rankings I have calculated their correlation with the results from a search of Google Scholar on the 4th of April. I have also included the scores for "publications" provided by THE next to their "snapshot" indicator.
There is a very strong correlation between the Google Scholar and the THE publication scores, a somewhat smaller correlation with the US News scores, which combine publication and citation data, and a moderate correlation with the QS scores, which include web impact, internationalisation and institutional data.
There is no significant correlation with the THE citation indicator. In fact, this "snapshot" correlates with nothing, not even THE's own publication data.
Correlations
But there are problems. Most Arab universities do not have sufficient staff and organisation to provide adequate data about faculty and student numbers, let alone about things like employment of graduates or sources of income. Surveys are a possible source of information. for teaching quality but it is unlikely that at present they could produce reliable and accurate information.
The collection of data about research has probably reached the point where reasonable research rankings can be created for the Arab world and also for regions like South Asia, Southeast Asia and Africa. Unfortunately, the majority of Arab or Middle Eastern universities produce very little significant research so such research based rankings would probably be of relevance to about fifty institutions.
See Higher Education Strategy Associates for more discussion.
The US News has produced a variation on its new Best Global Universities rankings. This is based on research and citations data from the Scopus database. As a ranking of universities according to research output and impact, it looks quite plausible.
The top five universities are:
1. King Saud University, Saudi Arabia
2. King Abdulaziz University,Saudi Arabia
3. King Abdullah University of Science and Technology, Saudi Arabia
4. Cairo University, Egypt
5. American University of Beirut, Lebanon.
QS has produced rankings that are "a pilot version of the ranking that has been developed to reflect priorities and challenges for universities in the region."
The top five are:
1. King Fahd University of Petroleum and Minerals, Saudi Arabia
2. American University of Beirut, Lebanon
3. King Saud University, Saudi Arabia
4. American University of Cairo, Egypt
5. King Abdulaziz University, Saudi Arabia.
This is a bit different from the US News rankings but the QS table does include indicators like faculty student ration, web impact and international faculty and students.
Times Higher Education has produced what it calls a "snapshot", which is in fact only one indicator, citations per paper normalised by field and year. The top five are:
1. Texas A&M University Qatar
2. Lebanese American University
3. King Abdulaziz University,Saudi Arabia
4. Qatar University
5. American University in Beirut.
Texas A&M Qatar is a single subject branch campus which does not have a doctoral programme. It is difficult to believe that it will actually be in the full rankings that are scheduled for next year. THE's top thirty does not include Cairo University or King Abdullah University of Science and Technology.
As a simple check on the validity of these three rankings I have calculated their correlation with the results from a search of Google Scholar on the 4th of April. I have also included the scores for "publications" provided by THE next to their "snapshot" indicator.
There is a very strong correlation between the Google Scholar and the THE publication scores, a somewhat smaller correlation with the US News scores, which combine publication and citation data and a moderate correlation with the QS scores, which include web impact, internationalisation and institutional data.
There is no significant correlation with the THE citation indicator. In fact, this "snapshot" correlates with nothing, not even THE's own publication data.
Correlations
Google Scholar | USN | QS | THE "snapshot" | THE publications | |
---|---|---|---|---|---|
Google Scholar | .757** .000 87 | .570** .000 49 | .075 .692 30 | .852** .000 30 |
|
USN | .757** .000 87 | .419* .017 32 | .266 .170 28 | .869** .000 28 | |
QS | 570** .000 49 | .419* .017 32 | .267 .337 15 | .412 .128 15 | |
THE snapshot | .075 .692 30 | .266 .170 28 | .267 .337 15 | .109 .567 30 | |
THE publications | .852** .000 30 | .869** .000 28 | .412 .128 15 | .109 .567 30 |
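As a rough illustration of the check described above, here is a minimal sketch of computing a Pearson correlation coefficient between two sets of university scores. The figures are invented for demonstration; they are not the actual Google Scholar or THE numbers.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical scores for eight universities on two measures.
google_scholar = [95, 80, 74, 60, 52, 40, 33, 25]
the_publications = [90, 85, 70, 62, 48, 45, 30, 28]

print(round(pearson_r(google_scholar, the_publications), 3))
```

A statistics package would also report the two-tailed significance and n shown in the table; the coefficient itself is the part computed here.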
Thursday, April 16, 2015
A Fictional, but Honest, College Rejection Letter
By Mimi Evans
Posted at Timothy MacSweeney's Blog
"Dear Applicant,
The Admissions Committee has carefully considered your application and we regret to inform you that we will not be able to offer you admission in the entering class of 2015, or a position on one of our alternate lists. The applicant pool this year was particularly strong, and by that I mean the Admissions Committee once again sent candidates like you multiple enticing pamphlets encouraging you to apply, knowing full well we had no intention of accepting you.
However, you will be pleased to know that you have contributed to our declining admissions rate, which has helped our university appear exclusive. This allows us to attract our real candidates: upper-class kids and certified geniuses who will glean no new information from our courses or faculty, whose parents can incentivize us with a new swimming pool or lacrosse stadium."
Wednesday, April 15, 2015
Moscow Ranking seminar
A seminar, "Road to Academic Excellence: Russian Universities in Rankings by Subject" was held in Moscow on April 9th and 10th.
Tuesday, March 17, 2015
QS Subject Rankings Postponed
From the QS topuniversities site
The planned publication of the QS World University Rankings by Subject 2015 has been postponed for the next few weeks.
In 2015, we have introduced minor methodological refinements which have allowed for improved discrimination, particularly among specialist institutions that are now featuring more materially in our work. Our new subjects for this year include several - such as Veterinary Sciences, Art & Design, Architecture, Dentistry, Business & Management Studies – that are delivered by single-faculty institutions as well as large, comprehensive universities.
We approach our work with passion, dedication, integrity and a strong sense of responsibility. In response to feedback, we have decided to extend the consultation process, to fully articulate the methodological refinements of the QS World University Rankings by Subject.
Please check back on TopUniversities.com for details on the revised release date of the QS World University Rankings by Subject 2015.
Monday, March 16, 2015
Malaysia and the Rankings: The Saga Continues
Malaysia has had a long and complicated relationship with global rankings ever since that wonderful moment in 2004 when the Times Higher Education Supplement (THES) -- Quacquarelli Symonds (QS) World University Rankings, as they were then, put Universiti Malaya (UM), the country's oldest institution, in the top 100 universities of the world.
It turned out that UM was only in the top 100 because of a ridiculous error by data gatherers QS who counted ethnic Indians and Chinese as international and so boosted the score for the international faculty and international student indicators. This was followed in 2005 by the correction of the error, or "clarification of data" as THES put it, and UM's extraordinary fall out of the top 100, often explained by higher education experts as a change in methodology.
There was another fall in 2007 when QS introduced several methodological changes, including the use of z-scores (that is, calibrating scores against the indicator means) and prohibiting survey respondents from voting for their own universities.
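The z-score recalibration can be sketched as follows: each university's raw indicator score is re-expressed as its distance from the indicator mean, measured in standard deviations, so that an indicator with widely spread scores does not dominate the total. The raw scores below are invented for illustration.

```python
import statistics

# Hypothetical raw scores on a single indicator for four universities.
raw_scores = {"Univ A": 92.0, "Univ B": 75.0, "Univ C": 61.0, "Univ D": 40.0}

mean = statistics.mean(raw_scores.values())
sd = statistics.pstdev(raw_scores.values())  # population standard deviation

# Calibrate each score against the indicator mean.
z_scores = {u: (s - mean) / sd for u, s in raw_scores.items()}
for university, z in z_scores.items():
    print(university, round(z, 2))
```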
In 2009 UM made something of a recovery rising from 230th in the Times Higher Education (the Supplement bit was dropped in 2008) charts to 180th, largely because of an increase in the number of faculty and a reduction in the number of students reported to QS.
In 2010 THE and QS went their separate ways, publishing their own rankings with different methodologies. UM dropped out of the top 200 of the QS rankings but was back again in 2011 and has now reached 151st place. It has stayed clear of the THE World University Rankings, which require the annual resubmission of data.
Every time UM or any of the other Malaysian universities rises or falls, it becomes a political issue. Ascent is seen as proof of the strength of Malaysian higher education, decline is the result of policy failures.
Recently, Malaysia's Second Education Minister argued that Malaysian higher education was now world class and on a par with countries such as the United Kingdom, Germany and Australia, because of its improved performance in the QS rankings and because it had attracted 135,000 foreign students.
Not everyone was impressed by this. Opposition MP Tony Pua criticised the reliance on the QS rankings, saying that they had been condemned by prominent academics such as Simon Marginson and that UM was not ranked by THE and performed much less well in the other rankings such as the Academic Ranking of World Universities.
The minister has riposted by noting that four Malaysian researchers were included in Thomson Reuters' list of influential scientific minds and that UM had been given five stars by QS.
So, who's right about Malaysian higher education?
First, Tony Pua is quite right about the inadequacies of the QS world rankings. They can be unstable, since the number of universities included changes from year to year and this can affect the scores for each indicator. Many of the scores for the objective indicators, such as faculty-student ratio and international students, seem exaggerated and appear to have had a bit of massaging somewhere along the line.
The biggest problem with the QS rankings is the academic and employer reputation surveys. These collect data from a variety of sources, have low response rates and are very volatile. They include respondents whose names are submitted by universities and those who nominate themselves. There were some suspiciously high scores for the academic reputation indicator in 2014: Peking University in 19th place, National Taiwan University in 37th, University of Buenos Aires in 52nd and the Pontifical Catholic University of Chile in 78th.
The employer survey also produces some counter-intuitive results: Università Commerciale Luigi Bocconi in 33rd place, Indian Institute of Technology Bombay in 60th, the American University of Beirut in 85th and Universidad de los Andes in 98th.
The QS world rankings can therefore be considered a poor reflection of overall quality.
Some critics have asserted that the THE rankings are superior and that Malaysian universities are being evasive by staying away from them. It is true that THE have won the approval of the British political and educational establishment. David Willetts, former universities and science minister, has joined the advisory board of THE's parent company. THE has highlighted comments on its recent reputation rankings by Greg Clark, universities, science and cities minister, Vince Cable, the Business Secretary and Wendy Piatt, director of the Russell Group of research intensive universities.
However, more informed observers such as Alex Usher of Higher Education Strategy Associates and Isidro Aguillo of the Cybermetrics Lab have little regard for these rankings.
Even Simon Marginson, who has moved to the London Institute of Education, now accepts that they are "fatally flawed once outside the top 50 universities".
The THE rankings have some serious methodological flaws. They assign a 33% weighting to a reputation survey. After the top six universities the number of responses drops off rapidly. After we leave the top 100 the number of votes on the survey is small and so it is quite normal for a few additional responses to have a disproportionate effect on the indicator scores and consequently on overall scores. QS does give an even greater weighting for reputation -- 50% -- but reduces annual fluctuations by carrying over responses for a further two years if they are not updated. The new Best Global Universities produced by US News takes a five year average of their reputation scores.
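The smoothing approaches mentioned here can be illustrated with a rolling five-year mean, in the spirit of the US News method. The yearly reputation scores below are invented, and the providers' exact averaging procedures are assumptions rather than published formulas.

```python
# Hypothetical yearly reputation scores for one university.
yearly_scores = [62.0, 55.0, 71.0, 50.0, 66.0, 53.0, 73.0]

def rolling_mean(series, window=5):
    """Mean of the last `window` values available at each year."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

smoothed = rolling_mean(yearly_scores)
# The smoothed series swings far less than the raw one, so a handful of
# extra survey responses in one year has a smaller effect on the final score.
print([round(s, 1) for s in smoothed])
```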
In addition, the THE rankings assign a 30% weighting to their Citations: Research Impact indicator, which is constructed so that contributions to publications with hundreds of contributing institutions, usually in physics, astronomy or medicine, can give a university an undeserved score for citations. Since their beginning, the THE rankings have shown bizarre results for research impact, putting places like Alexandria University, Moscow State Engineering Physics Institute, Federico Santa Maria Technical University in Valparaiso and Cadi Ayyad University in Marrakech in the top ranks of the world for research impact.
Yes, QS putting Tsinghua University in overall 47th place is questionable (the Shanghai Academic Ranking of World Universities has it in the 101-150 band), but on balance this is more plausible than putting, as THE does, Scuola Normale Superiore di Pisa in 63rd place (Shanghai has it in the 301-400 band).
The only situation in which it would make sense for UM to take part in the THE rankings would be if it was going to start a first rate particle physics programme including participation in the Large Hadron Collider project with multi-author publications that would bring in thousands of citations.
Rather than relying on the questionable QS and THE rankings, it would be a good idea to look at the progress of UM according to the less well known but technically competent research-based rankings. The Scimago Institution Rankings show that in a ranking of higher education institutions by research output UM was in 718th place in 2009. Since then it has risen to 266th place, behind 16 British (out of 189), 15 German and seven Australian universities and institutes.
This is similar to the CWTS Leiden Ranking which has UM in 270th place for number of publications (calculated with default settings) or the Shanghai rankings, where Nobel prizes are indicators, which place it in the 301-400 band.
This does not necessarily mean that there has been similar progress in graduate employability or in the quality of research. It does, however, mean that for research output Universiti Malaya, and maybe two or three other Malaysian universities, are now competitive with second tier universities in Britain and Germany.
This is probably not quite what most people mean by world-class but it is not impossible that in a decade UM could, if it sticks to current policies, be the rival of universities like Sheffield, Cardiff or Leeds.
But such progress depends on Malaysian universities focussing on their core missions and not falling into the quagmire of mission creep.
It also depends on something being done to remedy the very poor performance of Malaysian secondary schools. If that does not happen then the future of Malaysian higher education and the Malaysian economy could be very bleak.
Saturday, March 14, 2015
Reactions to the Times Higher Education Reputation Rankings
Tsinghua, Peking climb up
Durham University rated among the world's top 100 by scholars
Australia's most respected university named
Delft drops down Times best reputation university rankings
Moscow State University and St. Petersburg State University rank among the top 100 universities
HK universities suffer drop in world rankings
India fails to make it to world's top 100 reputed universities list
Oxbridge up in university rankings
National Taiwan University falls in world reputation rankings
Lomonosov University Brings Prestige to Russia in World Ranking
Johns Hopkins Was Ranked One of the Best Universities in the World. Again.
Thai universities not recognised in worldwide ranking of institutes
Saturday, February 28, 2015
Now an authority on cranberries
I just noticed that among the keywords that lead people to this blog are "asian sex diary" and "cranberry sciense paper".
The former probably has something to do with a post somewhere that mentioned gender bias in Asian universities, but cranberries?
Tuesday, February 24, 2015
The THE MENA Ranking: Interesting Results.
Times Higher Education (THE) has just published its ranking of universities in the MENA (Middle East and North Africa) region. It is, according to THE's rankings editor Phil Baty, "just a snapshot" of what MENA rankings might look like after consultation with interested parties.
The ranking contains precisely one indicator, field-normalised citations, meaning that it is not the number of citations that matter but the number compared to the world average in specific fields. This was the flagship indicator in the THE world rankings and it is surprising that THE should continue using it in a region where it is inappropriate and produces extremely implausible results.
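Field normalisation of this kind can be sketched as follows: each paper's citation count is divided by the world average for papers of the same field and year, and the institution's score is the mean of those ratios. All figures below are invented, and the actual benchmark values and weighting details are assumptions.

```python
# (field, year) -> hypothetical world-average citations per paper
world_average = {
    ("physics", 2012): 10.0,
    ("medicine", 2012): 14.0,
    ("engineering", 2013): 5.0,
}

# (field, year, citations) for one hypothetical university's papers
papers = [
    ("physics", 2012, 30),     # three times the world average
    ("medicine", 2012, 7),     # half the world average
    ("engineering", 2013, 5),  # exactly average
]

ratios = [cites / world_average[(field, year)] for field, year, cites in papers]
normalised_impact = sum(ratios) / len(ratios)
print(normalised_impact)  # prints 1.5
```

With so few papers, a single hugely cited multi-contributor publication can lift the whole average, which is how a small institution can end up at the top of a research-impact ranking.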
Number one in MENA is Texas A&M University at Qatar. This is basically an engineering school, evidently of a very high quality, and it is not clear whether it is a genuinely independent institution. It offers undergraduate courses in engineering and has master's programmes in chemical engineering. Its output of research is meagre, as THE obligingly indicates in its press release.
How then did it get to the top of a research impact ranking? Easily. One of its faculty, with a joint appointment at the mother campus in Texas, is one of the collaborators on a multi-contributor paper emanating from CERN. I will leave it to somebody else to count the number of contributors.
Another CERN collaborator, Cadi Ayyad University in Morocco, is in sixth place. King Abdulaziz University is third.
There are ten Egyptian universities in the top thirty, including Alexandria but not Cairo.
Saturday, February 21, 2015
Slipping down the curve
The ETS Center for Research on Human Capital and Education has produced an analysis of the performance of American millennials (young adults born after 1980 and aged 16-34 at the time of assessment) on the Programme for the International Assessment of Adult Competencies (PIAAC) conducted by the OECD. The analysis may be over-optimistic in places but in general it is a devastating forecast of a coming crisis for American higher education and very probably for American society.
American millennials are, by historical and international standards, well educated, at least in terms of years of schooling, but they are also, on average, less literate, less numerate and less able to solve problems than their international counterparts. To be blunt, they appear to be relatively less intelligent.
Let's start with the literacy scores for adults aged 16 to 65 tested by PIAAC, that is basically the adults making up the current work force.
The average score for OECD countries is 273. There is nothing unusual at the top -- Japan 296, Finland 288, the Netherlands 284.
The score for the USA is 270, just below the average and better than only six OECD countries. Overall the USA is at the moment mediocre compared with other developed nations.
Turning to numeracy, the OECD average is 269. Once again the top is dominated by East Asia and the shores of the Baltic and North Seas: Japan (288), Finland (282), Flanders (280), the Netherlands (280). The USA at 253 is well below average. Only Italy and Spain have lower scores.
For problem solving in technology-rich environments, the USA, with a score of 277 is again below the OECD average of 283.
This is the current work force: below average for literacy and problem solving, well below average for numeracy. It includes many who will soon die or retire and be replaced by the millennial and post-millennial generations.
Take a look at the millennials. The gap is widening. For literacy, the 6-point gap between the OECD average and the USA for 16-65 year olds becomes 8 points for the millennials.
For numeracy, the 13-point gap for 16-65 year olds has become 21 points for the millennials, and for problem solving 6 points becomes 9.
The situation becomes bleaker when we look at those who fail to meet minimum proficiency standards. Fifty percent of US millennials score below literacy level 3, 64% below numeracy level 3, figures exceeded only by Spain, and 56% below level 2 proficiency in problem solving, the worst among developed countries reporting data.
Nor is there any hope that there may be a recovery from the younger section of the cohort, those aged between 16 and 24. The literacy gap remains the same at eight points but the numeracy and problem solving gaps each increase by an additional point.
The report also emphasises the large and increasing gap between the high and low skilled. Here there is a big danger. A gap can be closed from two ends and in the US it is easy to drag down high achievers by curtailing Advanced Placement programs, grade inflation, removal of cognitive content from college courses, group projects, holistic admissions and assessment and so on. The problem is that the closing of domestic gaps in this way just widens the international gap.
American millenials are by historical and international standards well educated, at least in terms of the number of years of schooling but also, on average less literate, less numerate and less able to solve problems than their international counterparts. To be blunt, they appear to be relatively less intelligent.
Let's start with the literacy scores for adults aged 16 to 65 tested by PIAAC, that is basically the adults making up the current work force.
The average score for OECD countries is 273. There is nothing unusual at the top -- Japan 296, Finland 288, the Netherlands 284.
The score for the USA is 270, just below average.and better than six OECD countries. Overall the USA is at the moment mediocre compared with other developed nations.
Turning to numeracy, the OECD average is 269. Once again the top is dominated by East Asia and the shores of the Baltic and North Seas: Japan (288), Finland (282), Flanders (280), the Netherlands (280). The USA at 253 is well below average. Only Italy and Spain have lower scores.
For problem solving in technology-rich environments, the USA, with a score of 277 is again below the OECD average of 283.
This is the current work force, below average for literacy and problem solving, well below average for numeracy. It includes many who will soon die or retire and will be replaced by the millenial and post-millennial generations.
Take a look at the millenials. The gap is widening. For literacy the 6 point gap between the OECD average and the USA for 16-65 year olds becomes 8 points for the millennials.
For numeracy, the 13 point gap for 16 -65 year olds has become 21 points for the millenials and for problem solving 6 points becomes 9.
The situation becomes bleaker when we look at those who fail to meet minimum proficiency standards. Fifty percent of US milllenials score below literacy level 3, 64% below numeracy level 3, figures exceeded only by Spain, and 56% below level 2 proficiency in problem solving, the worst among developed countries reporting data.
Nor is there any hope of a recovery from the younger section of the cohort, those aged between 16 and 24. The literacy gap remains the same at eight points, but the numeracy and problem-solving gaps each increase by a further point.
The report also emphasises the large and increasing gap between the high- and low-skilled. Here there is a big danger. A gap can be closed from either end, and in the US it is easy to drag down high achievers by curtailing Advanced Placement programs, inflating grades, stripping cognitive content from college courses, relying on group projects, adopting holistic admissions and assessment, and so on. The problem is that closing domestic gaps in this way just widens the international gap.
Wednesday, February 18, 2015
Free Speech Ranking
As the First Armoured Division made its way across Libya towards Tunisia at the end of 1942 and in early 1943, the troops were kept busy with early morning PT, lectures on "my county" and "the Everest expedition", and debates on motions such as "the Channel Tunnel would be a benefit". In his diary, my father, then a humble signalman, recounted another debate, on whether permanent conscription is a national asset: "Horace as usual made a vociferous speech and he said, 'There, bang goes my two stripes'."
The spectacle of soldiers in the middle of a war arguing against government policy with no more penalty than forfeiting two stripes -- if, in fact, Horace ever did lose them -- sounds slightly surreal today. Especially so, now that western schools, universities and other organisations appear to be becoming more and more hostile to "dangerous" ideas, a category that seems to be expanding relentlessly.
The British online magazine Spiked has just published its first Free Speech University Rankings, which are worth reading in detail.
These are actually ratings, not rankings, and divide universities into three categories:
- Red: has actively banned and censored ideas on campus
- Amber: has chilled free speech through intervention
- Green: has a hands-off approach to free speech.
Just a few examples:
Birkbeck University Students Union has apparently banned UKIP, because "homophobia, Islamophobia, disablism, xenophobia, misogyny, racism, fascism, and general discrimination [sic!] is rife amongst its members, supporters, officials, and prospective candidates". If that wasn't bad enough, "John Sullivan, UKIP candidate for Forest of Dean and West Gloucestershire, said that regular physical exercise for boys released tension and thus avoided homosexuality."
The University of East London Students Union has banned materials opposing unrestricted abortion because "any material displayed in the Union building should adhere to the principle of ‘safe space’ and which resolves to ‘ensure an accessible environment in which every student feels comfortable, safe and able to get involved in all aspects of the organisation free from intimidation or judgement".
The University of Warwick, noting the protected characteristics of "age, disability, gender reassignment, race, religion or belief, sex, sexual orientation, pregnancy and maternity and marriage or civil partnership", prohibits "displaying material that is likely to cause offence to others" or "spreading malicious rumours or insulting someone".
Monday, February 09, 2015
Affiliation in the News Again
Haaretz has published a story about Ariel University, in the occupied West Bank, suggesting that it is offering to pay researchers for adding its name to papers and grant proposals.
The report may be biased and the offer, which seems to apply to only one field, is probably an attempt to get round local and international ostracism. It is a much less blatant attempt to buy affiliations and therefore citations than the wholesale distribution of part time contracts by King Abdulaziz University in Jeddah to researchers on the Thomson Reuters Highly Cited lists.
Another case of affiliation abuse was that of Mohamed El Naschie, formerly editor of the journal Chaos, Solitons & Fractals, and writer of many articles that were cited frequently by himself and a few friends. El Naschie was also fond of giving himself affiliations that had little or no substance: Cambridge, where he was a Visiting Scholar allowed to use the library and other facilities; the University of Surrey, for no discernible reason; and Alexandria University, with which he had a tenuous connection.
Most of El Naschie's affiliations did not mean very much. Cambridge was getting lots of citations anyway and did not need him. But Alexandria University produced only a modest amount of research, and El Naschie's self-citations went a long way, taking Alexandria into the top 200 of the THE 2010 world university rankings.
This sort of thing is likely to continue, especially since there is now a stream of papers and reviews in physics, and sometimes in medicine and genetics, with hundreds of contributors and scores of contributing institutions. A part-time contract with a contributor to the Review of Particle Physics that includes adding the institution as a secondary affiliation could give an enormous boost to citation counts, especially if they are field- and year-normalised.
It would be a good idea for academic editors and publishers to review their policies about the listing of affiliations. Perhaps second (or more) affiliations should only be allowed if documentary evidence of a significant connection is provided.
Likewise, rankers ought to think about not counting secondary affiliations, as the Shanghai Center for World-Class Universities did last year, or giving them a reduced weighting.
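The reduced-weighting idea can be sketched as a simple fractional counting scheme: a paper's citations are divided among its listed institutions, with secondary affiliations given a lower weight than primary ones. The 0.5 weight and the function name here are purely illustrative assumptions, not any ranker's actual method:

```python
def fractional_credit(citations, primary, secondary=(), secondary_weight=0.5):
    """Split a paper's citations among its institutions.

    Each primary affiliation gets weight 1.0; each secondary affiliation
    gets a reduced weight (0.5 here -- an arbitrary illustrative choice).
    """
    weights = {inst: 1.0 for inst in primary}
    for inst in secondary:
        weights[inst] = weights.get(inst, 0.0) + secondary_weight
    total = sum(weights.values())
    return {inst: citations * w / total for inst, w in weights.items()}

# A paper with 1,000 citations: two primary institutions, one secondary.
credit = fractional_credit(1000, primary=["A", "B"], secondary=["C"])
# A and B each receive 400 and C receives 200, instead of each
# institution receiving the full 1,000 under whole counting.
```

Under whole counting a secondary affiliation is as valuable as a primary one, which is exactly what makes buying affiliations attractive; any scheme along these lines removes most of that incentive.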
Saturday, February 07, 2015
Ranking Universities is a Really Serious Business
It seems that Webometrics has been hacked. Let's hope the problem is sorted out soon.
Ranking Web of Universities was attacked by external hackers. They did published hate messages and they had access to the Ranking, changing significantly the rank of at least one university and altering the structure and arrangement of the system. We are trying to fix the problems and sincerely apologize for any inconvenience. We hope to be able to be back in a few days with very exciting news and updated information. Thanks for your patience.
Wednesday, February 04, 2015
New York Times said it was a really big gap closing. It didn't seem so big then.
[Apologies to Bob Dylan, 'Talkin' New York']
The world of education is obsessed with gaps. Every time the PISA results come out there is renewed concern about the stubborn and growing gap between the United States and some Asian and Eastern European countries, although the US should perhaps be congratulated for every large ethnic group doing as well as or better than its international counterparts.

At the same time, there is recurrent anguish over the failure of African Americans and Hispanics to match the academic achievements of Whites and (East?) Asians.

According to the New York Times (NYT), the huge achievement gap between wealthy and poor American children is a major cause of the mediocre performance of the American economy. Just closing the American gap and going up a few points in the PISA rankings would apparently boost the economy significantly and generate billions in tax revenues.

So how to do this? The NYT reports that a recent study by the Washington Center for Equitable Growth claims that things like more early childhood education, reducing exposure to lead paint and letting students sleep a bit more will do the trick.

And has anyone managed to close the gap? Yes, according to the study. Montgomery County, Maryland, an affluent, racially mixed county near Washington DC,

"was able to reduce the gap and increase scores after instituting all-day kindergarten programs, reducing class size, investing in teacher development and reducing housing-based segregation in its schools and a host of other reforms, Montgomery County, Maryland was successful in both improving average achievement test scores and reducing achievement gaps. The percentage of 5th graders reading at or above the proficient level on the Maryland State Assessment rose for all racial and ethnic groups between 2003 and 2009. In addition, gaps between the disproportionately lower-income black and Hispanic students and the disproportionately higher-income white and Asian students narrowed."

So should we all go to Montgomery County to find out how to close the gap, bring America up to OECD, or even Finnish or Korean, standards and achieve Chinese rates of economic growth?

Perhaps not.
The good news from Montgomery was a
little surprising because I was sure that I had read a story that painted a
rather less cheerful picture of the school system there.
Locals were baffled that the school system could spend so much money and still do so badly.
Here it is. From the Washington Post of March 12, 2013. 'In Montgomery schools, the achievement gap widens in some areas', by Donna St George
"The achievement gap that separates white and Asian students from black and Latino students has grown wider in Montgomery County in several measures of academic success, according to a report released Tuesday."
“The 130-page report points to progress in five of 11 performance indicators in recent years. The school system improved on gaps in school readiness and high school graduation, for example. But disparities widened in advanced-level scores for state math exams in third, fifth and eighth grades. There were mixed results in two categories.”
' “We still rank as one of the top spenders nationally in education, and then to lose ground is extremely concerning,” said Council Vice President Craig Rice (D-Upcounty), who called for more urgency. “It just boggles my mind that this can be so far below the radar.” '
But evidently the preferred solution is more money.
Montgomery Superintendent Joshua P. Starr said he agrees with most of the analysis. He wrote,
“much of the $10 million the school system is seeking above mandatory funding levels in its budget proposal would help address achievement disparities, including 30 “focus” teachers to reduce class sizes in English and math at middle and high schools where students are struggling."
Anyway, here are some extracts from the report from Montgomery County itself.
“This report finds that since 2008 MCPS has made progress, but significant achievement gaps remain, particularly among measures of at-risk academic performance. Over the same period, MCPS also lost ground in narrowing the achievement gap among several measures of above grade level performance that align with MCPS’ Seven Keys initiative and the Common Core State Standards.”
and
“MCPS narrowed the achievement gap across five measures: school readiness, MSA proficiency, suspensions, academic ineligibility, and graduation rates. These gaps narrowed by increasing the performance of most subgroups while accelerating the performance of the lowest performing subgroups."
and
“MCPS achieved mixed or no progress in narrowing the gap on two measures: dropout rates and completion of USM or CTE program requirements among graduates. For these two measures, MCPS tended to narrow the gap by race and ethnicity, but did not achieve the same progress among service groups.”

and
“MCPS’ achievement gap widened across four measures: MSA advanced scores, Algebra 1 completion by Grade 8 with C or higher, AP/IB performance, and SAT/ACT performance. Among these four measures of above grade level performance that align with MCPS’ Seven Keys, high performing subgroups made greater gains on these benchmarks than low performing subgroups, thus widening the gap. More specifically: • The MSA Advanced Gaps in Grade 3 narrowed across most subgroups for reading by 2-7% but widened for math by 5-33% from 2007 to 2012; the Grade 5 gaps narrowed across most subgroups for reading by 2-16% but widened for math by 3-37%, and the Grade 8 gaps widened for both reading and math by 9-56%. • The Algebra 1 by Grade 8 with C or Higher Gap widened by 7-19% by race, ethnicity, special education, and FARMS status from 2010 to 2012, but narrowed by 7% by ESOL status. • The AP/IB Performance Gap among graduates widened by 6-37% by race, ethnicity, and service group status from 2007 to 2012. • The SAT/ACT Performance Gap among graduates held constant by special education and ESOL status from 2010 to 2012, but increased by race, ethnicity, and income by 3-6%."
So. Montgomery County reduced the gap for grade 5 reading for most subgroups but it widened for math. By Grade 8 it widened for reading and math, as did the AP/IB performance gap and the SAT/ACT gap for most groups.
When we get to the Center for Equitable Growth and the New York Times, only the grade 5 reading improvement remains and the failures in the other areas have disappeared. How very careless of them.
Monday, February 02, 2015
Quality and Bias in University Rankings
I have just finished reading a very interesting unpublished paper, 'Measuring University Quality' by Christopher Claassen of the University of Essex.
He finds that all the major international rankings tap to some extent an underlying unidimensional trait of university quality and that this is measured more accurately by the US News Best Global Universities, the Center for World University Rankings (Jeddah) and the Academic Ranking of World Universities (Shanghai).
He also finds that these rankings are not biased towards their home countries, in contrast to the Times Higher Education, QS and Webometrics rankings.
Friday, January 30, 2015
Who says university isn't worth it?
In Malaysia it might be.
According to a local blog the customary dowry ("hantaran," distinct from the religiously sanctioned "mas kahwin" which is very modest) paid to the family of the bride varies significantly according to the bride's level of education.
For a woman with UPSR (primary school certificate) it is 2,000-4,000 Ringgit.
For SPM (secondary school certificate) holders it is 4,000-8,000 Ringgit.
For STPM holders (equivalent to 'A' levels) it is 8,000-12,000 Ringgit.
For degree holders it is 12,000-15,000 Ringgit.
For master's holders it is 15,000-20,000 Ringgit.
For PhD holders it is 20,000-30,000 Ringgit.
As far as I know, there is no premium for international universities or for those with a high place in the global rankings. Yet.