Light Posting Ahead
For the next few weeks posting will be light as I am attending to family matters.
"Any discussion of Malaysian tertiary educational policy needs to take into account the needs of national development in a specific and historical context. Recent debates in regard to the competitive position of Malaysian higher education globally is one area where the pressures of competition and liberalisation must be balanced by the interests of inclusion and social sustainability."
"The discourse of neo-liberal globalisation is itself still arguably beholden to what Syed Hussein Alatas critiqued as the discourse of “The Lazy Native”. Higher educational institutions’ commitment to inclusion and social justice is central to their merit in society."
The fractional counting method gives less weight to collaborative publications than to non-collaborative ones. For instance, if the address list of a publication contains five addresses and two of these addresses belong to a particular university, then the publication has a weight of 0.4 in the calculation of the bibliometric indicators for this university. The fractional counting method leads to a more proper normalization of indicators and to fairer comparisons between universities active in different scientific fields. Fractional counting is therefore regarded as the preferred counting method in the Leiden Ranking.
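As a concrete illustration, here is a minimal sketch of how such a fractional weight might be computed. The function name and the sample address list are hypothetical, not taken from the Leiden Ranking's own code:

```python
# Minimal sketch of fractional counting for a single publication.
# A publication's weight for a university is the share of its address
# list that belongs to that university.

def fractional_weight(addresses, university):
    matches = sum(1 for a in addresses if a == university)
    return matches / len(addresses)

# The example from the text: an address list with five entries,
# two of which belong to "University X", gives a weight of 2/5 = 0.4.
addresses = ["University X", "University X", "University Y",
             "University Z", "Institute W"]
print(fractional_weight(addresses, "University X"))  # 0.4
```

Under full counting, by contrast, the same publication would contribute a weight of 1 to every university in its address list, which tends to favour institutions active in fields with large collaborative teams.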
Clever people have figured out that there is a growing demand for outlets for scholarly work, that there are too few journals or other channels to accommodate all the articles written, that new technology has created confusion as well as opportunities, and, finally and more worryingly, that there is money to be made in the knowledge communication business. As a result, there has been a proliferation of new publishers offering new journals in every imaginable field. The established for-profit publishers have also been purchasing journals and creating new ones so that they can “bundle” them and offer them at high prices to libraries through electronic subscriptions.
The league tables show the percentage of 11-year-olds in each school reaching Level 4 – the standard expected for their age group – in both English and maths at primary school. Officially, this means they can spell properly, start to use grammatically complex sentences and employ joined up handwriting in English. In maths, they should be able to multiply and divide whole numbers by 10 or 100 and be able to use simple fractions and percentages.
Pupils exceeding this standard are awarded a higher Level 5. Data for individual schools also shows three other measures: average points score, value-added and pupil progress.
MIT expects that this learning platform will enhance the educational experience of its on-campus students, offering them online tools that supplement and enrich their classroom and laboratory experiences. MIT also expects that MITx will eventually host a virtual community of millions of learners around the world.
- organize and present course material to enable students to learn at their own pace
- feature interactivity, online laboratories and student-to-student communication
- allow for the individual assessment of any student’s work and allow students who demonstrate their mastery of subjects to earn a certificate of completion awarded by MITx
- operate on an open-source, scalable software infrastructure so that it can be continuously improved and made readily available to other educational institutions.
It looks as though a two-tier international university ranking system is emerging.
At the top we have the 'big three', Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.
These receive massive attention from the media, are avidly followed by academics, students and other stakeholders, and are often quoted in promotional literature. Graduation from a university included in these rankings has even been proposed as a requirement for immigration.
Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.
These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.
Going through the comparison of the various methodologies, the report details what is actually measured, how the indicator scores are calculated and how the final scores are compiled, and therefore what the results actually mean.
The first criticism of university rankings is that they principally measure research activity rather than teaching. Moreover, the ‘unintended consequences’ of the rankings are clear, with more and more institutions tending to modify their strategies in order to improve their positions in the rankings instead of focusing on their main missions.
For some ranking systems, lack of transparency is a major concern, and the QS World University Ranking in particular was criticized for not being sufficiently transparent.
The report also reveals the subjectivity in the proxies chosen and in the weight attached to each, which leads to composite scores that reflect the ranking provider’s concept of quality (for example, a given indicator may be assigned 25% or 50% of the overall score, and that choice reflects a subjective judgement about what is important for a high-quality institution). In addition, indicator scores are not absolute but relative measures, which can complicate comparisons. For example, if the indicator is the number of students per faculty member, what does a score of, say, 23 mean? That there are 23 students per faculty member? Or that this institution has 23% of the students per faculty member of the institution with the highest ratio? Moreover, the choice between simple counts and relative values is not neutral: the Academic Ranking of World Universities, for example, does not take the size of institutions into account.
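To make the point about relative scores and subjective weights concrete, here is a hypothetical sketch; the indicators, weights and numbers below are invented for illustration and do not come from any actual ranking's methodology:

```python
# Hypothetical sketch: building a composite score from weighted,
# relative indicator scores. Raw indicator values are scaled so the
# best-performing institution gets 100 (a relative, not absolute, score;
# a score of 23 would mean 23% of the top institution's value).

def relative_scores(raw):
    top = max(raw.values())
    return {name: 100 * value / top for name, value in raw.items()}

def composite(scores_by_indicator, weights):
    # Weighted sum of relative indicator scores; the weights sum to 1
    # and encode a subjective judgement of what "quality" means.
    return sum(weights[ind] * score for ind, score in scores_by_indicator.items())

# Invented raw data for two indicators.
citations_per_faculty = {"Uni A": 480, "Uni B": 360, "Uni C": 120}
faculty_per_100_students = {"Uni A": 4.0, "Uni B": 6.0, "Uni C": 3.0}

rel_cit = relative_scores(citations_per_faculty)      # Uni A -> 100.0, ...
rel_fac = relative_scores(faculty_per_100_students)   # Uni B -> 100.0, ...

weights = {"citations": 0.5, "faculty": 0.5}          # a subjective choice
for uni in citations_per_faculty:
    total = composite({"citations": rel_cit[uni], "faculty": rel_fac[uni]}, weights)
    print(uni, round(total, 1))
```

With equal weights, Uni B comes out ahead of Uni A (87.5 against 83.3); shift the weights to 0.75 for citations and 0.25 for the faculty indicator and Uni A moves to the top, even though nothing about either institution has changed. That is the sense in which the composite score reflects the ranking provider's concept of quality rather than an objective measurement.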
The EUA report makes several recommendations for ranking-makers, including the need to mention what the ranking is for, and for whom it is intended. Among the suggestions to improve the rankings, the following received the greatest attention from the audience:
- Include non-journal publications properly, including books, which are especially important for social sciences and the arts and humanities;
- Address language issues (for example, whether an abstract is available in English, since local-language versions are often less visible);
- Include more universities: currently the rankings assess only 1–3% of the 17,000 existing universities worldwide;
- Take into consideration the teaching mission with relevant indicators.
Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.
The list will include only universities located within the G8 countries that enter the top 300 in the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.
The reform is intended to attract more students to take part in Russian MA and PhD programs.
He said that he would discuss his papers with fellow scientists, and only when he thought that they were of a sufficiently high standard would he publish them. "I am too arrogant and have too much self-respect to allow a bad paper to pass through," he said.
Prof El Naschie called one witness, Prof Otto Rossler - an honorary editor of Chaos, Solitons and Fractals.
He told the court that there was no one capable of peer reviewing Prof El Naschie because "if you have something new to offer, peer review is dangerous", adding that in such cases "peer review delays progress in science".
Prof El Naschie asked his witness whether he thought that his (Prof El Naschie's) papers were of "poor quality".
Prof Rossler replied: "On the contrary, they were very important and will become more important in the future."
And he added: "You are the most hard-working and diligent scientist I have ever met."
PETALING JAYA: Malaysia has little to show for its universities despite spending more money on tertiary education than many other countries do.
Malaysian universities lag behind many counterparts in Asia, including those located in neighbouring countries like Thailand and Singapore, according to a World Bank report released today.
“While Malaysia spends slightly more than most countries on its university students, leading Malaysian universities perform relatively poorly in global rankings,” said the report, entitled Malaysia Economic Monitor: Smart Cities.
Citing the Quacquarelli Symonds (QS) World University Rankings 2010, it noted that Universiti Malaya (UM) was ranked 207th worldwide and 29th in Asia.
It also quoted the US News and World Report 2011 list of the World’s Best Universities, which put UM, Universiti Kebangsaan Malaysia, Universiti Sains Malaysia and Universiti Putra Malaysia at 167th, 279th, 335th and 358th place respectively.
Even more worrying, the World Bank report observed, was the “increasing gap” between Malaysia’s and Singapore’s universities.
It compared UM with the National University of Singapore (NUS), which QS cited as the leading university in Southeast Asia.
“The gap between UM and NUS has been high and generally increasing, especially in the sciences,” the report said.
According to the report, UM and NUS were on a par when it came to science and technology in 2005. However, UM has lost out to NUS over the past six years.
The report also said many of Malaysia’s university graduates did not seem to have the skills that would help them get employment.