Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Monday, August 17, 2009
The CCAP (Center for College Affordability and Productivity)/Forbes rankings are rather different from the rest, being emphatically based on outcomes rather than spending.
One quarter of the weighting of these rankings is for student satisfaction, based on scores from the RateMyProfessors site; another quarter is for graduate success, derived from Who's Who in America and payscale.com; a quarter is for current students' success (graduation rates and winners of national student awards); a fifth is for the debt incurred by students; and five per cent is for faculty quality.
Richard Vedder, the director of CCAP, claims that the rankings are relatively difficult to manipulate. Up to a point this is true. I cannot see much that anyone could do about Who's Who. But if these rankings ever overtook the USNWR rankings there could well be a lot of fiddling with graduation rates and innovative financial aid packages.
Anyway, the overall top five are:
1. US Military Academy
2. Princeton
3. Caltech
4. Williams College
5. Harvard
The top five best value colleges are:
1. Berea College, Kentucky
2. New College of Florida
3. US Military Academy
4. US Air Force Academy
5. University of Wyoming
The top five national research universities are:
1. Princeton
2. Caltech
3. Harvard
4. Yale
5. Stanford
This is from The Morse Code:
"It's getting very close to the launch of the new America's Best Colleges rankings. The 2010 edition will be published on Thursday, August 20, which is the day the new rankings go live on our website. The site will have the most complete version of the rankings, tables, and lists, plus extensive profiles on each school. The America's Best Colleges website also will have wide-ranging interactivity as well as a newly upgraded search feature to enable students and parents to find the school that best fits their needs.
These exclusive rankings will also be published in the magazine's September 2009 issue and in our newsstand guidebook, both of which will go on sale around August 20. The main rankings include the national universities, liberal arts colleges, master's universities, and baccalaureate colleges by region. In addition, there will be one new ranking to show which schools have the greatest "commitment to undergraduate teaching." For the second year in row, we will publish the very popular list of "Up-and-Coming Institutions"—the colleges making innovative improvements. In addition, we will have our third annual ranking of Historically Black Colleges and Universities. "
Wednesday, August 12, 2009
In 2005 Duke University made an impressive showing in the THES-QS World University Rankings largely because someone at Quacquarelli Symonds counted undergraduate students as faculty. (see post January 29, 2007)
Perhaps it was not really an error. It looks as if at least one Duke professor is intent on handing over her teaching duties to her students.
Cathy Davidson, a Duke professor, has told us about her "innovative" grading policies.
"I loved returning to teaching last year after several years in administration . . . except for the grading. I can't think of a more meaningless, superficial, cynical way to evaluate learning in a class on new modes of digital thinking (including rethining [sic or perhaps not -- maybe she means making even less substantial] evaluation) than by assigning a grade. It turns learning (which should be a deep pleasure, setting up for a lifetime of curiosity) into a crass competition: how do I snag the highest grade for the least amount of work? how do I give the prof what she wants so I can get the A that I need for med school? That's the opposite of learning and curiosity, the opposite of everything I believe as a teacher, and is, quite frankly, a waste of my time and the students' time. There has to be a better way . . .
So, this year, when I teach "This Is Your Brain on the Internet," I'm trying out a new point system. Do all the work, you get an A. Don't need an A? Don't have time to do all the work? No problem. You can aim for and earn a B. There will be a chart. You do the assignment satisfactorily, you get the points. Add up the points, there's your grade. Clearcut. No guesswork. No second-guessing 'what the prof wants.' No gaming the system. Clearcut. Student is responsible.
And how to judge quality, you ask? Crowdsourcing. Since I already have structured my seminar (it worked brilliantly last year) so that two students lead us in every class, they can now also read all the class blogs (as they used to) and pass judgment on whether they are satisfactory. Thumbs up, thumbs down. If not, any student who wishes can revise. If you revise, you get the credit. End of story. Or, if you are too busy and want to skip it, no problem. It just means you'll have fewer ticks on the chart and will probably get the lower grade. No whining. It's clearcut and everyone knows the system from day one. (btw, every study of peer review among students shows that students perform at a higher level, and with more care, when they know they are being evaluated by their peers than when they know only the teacher and the TA will be grading). "
So, every class is led by two students. An A is awarded for showing up for class, doing the work and having it judged as satisfactory by classmates or revising it after being judged unsatisfactory.
If classes are led by students, who also choose the reading and writing assignments and evaluate class contributions, and work is graded by students, then what is Professor Davidson being paid for?
Another point. Professor Davidson claims that all studies show that students perform at a higher level when they know they are being evaluated by peers rather than only by a teacher and a teaching assistant. We could of course argue about whether every study shows this and what a higher level means. But note that the studies are comparing students graded by peers and, presumably, instructors with those graded only by teacher and TA. From what Professor Davidson tells us grading in her class is done only by students and therefore the results of such studies cannot be used to support her claims.
Note: acknowledgement to Durham-in-Wonderland.
Monday, August 10, 2009
The new Performance Ranking of Scientific Papers for World Universities is out.
This is based on a variety of measures derived from the Essential Science Indicators database. It is therefore more orientated towards quality than the THE-QS rankings, which use the more comprehensive but less selective Scopus database.
These rankings may become more influential in the future so it might be worthwhile making a few comments. First, like nearly all rankings, these show a bias towards the citation-heavy natural sciences. Second, eight indicators may be too many, since at least some of them may simply be counting the same thing. Third, there is no attempt to measure anything other than research.
Still, the current rankings are important. Looking at the overall index, we find that Oxbridge and some of the Ivy League schools are definitely slipping, deprived of the support from the THE-QS academic survey and the aging or dead laureates of the Shanghai index. Cambridge is in 15th place, Yale 16th, Oxford 17th and Princeton 38th.
Here are the top five.
1. Harvard
2. Johns Hopkins
3. Stanford
4. University of Washington at Seattle
5. UCLA
I am wondering about the University of Washington, which is 16th in the Shanghai rankings and 59th in the THE-QS.
Sunday, August 09, 2009
It took a while for me to decide that this article by David G. Savage in the San Francisco Chronicle was not a parody. It is nonetheless worth reading carefully. Much of it will sound familiar to those who are aware of the ongoing debate about how university students and faculty should be selected.
The article begins:
"Justice Sonia Sotomayor will bring something new to the U.S. Supreme Court, far beyond her being its first Latina member."
And what will she bring? Savage approvingly lists the attributes that will justify her appointment to the Supreme Court.
- She will be the only judge whose first language is not English.
- She is diabetic.
- She grew up in a housing project where drugs and crime were more common than "Ivy League scholarly success".
- Her SAT scores were not very good but she managed to graduate first in her class at Princeton.
- "[She] is also a divorced woman with no children but a close relationship with an extended family.
"She is a modern woman with a nontraditional family," said Sylvia Lazos, a law professor at the University of Nevada at Las Vegas. "She is much more reflective of contemporary American society than the other justices, like Alito and Roberts."
She was referring to Chief Justice John Roberts and Justice Samuel Alito, both of whom are married and have two children. The court soon is expected to face a series of cases involving the legal rights of other nontraditional families with gay and lesbian couples."
- She has had trouble paying her mortgage and credit cards.
- She has been a prosecutor and a trial judge.
- She will be one of two minorities on the court, the other being Clarence Thomas, and the only one who supports Affirmative Action. Apparently Jews, Italians and WASPs are not minorities.
So Sotomayor is qualified for the highest judicial office in the United States because she is a speaker of English as a second language, a diabetic, not a good test taker but hard working, divorced, childless, a member of a recognised minority, a supporter of Affirmative Action and a poor financial manager.
The time will come, I suspect, when these will be essential qualifications for faculty positions in the US and elsewhere.
And will someone please explain to me why Sotomayor's childlessness is more reflective of contemporary American society than Roberts's and Alito's two children apiece. Or is Professor Lazos living in a parallel universe where the American fertility rate is zero?
Tuesday, August 04, 2009
From the Hawaii Star-Bulletin
David Ross, chairman of the University of Hawaii-Manoa Faculty Senate's Executive Committee, claims that the university's ranking performance means that they should not have to take a pay cut.
"Recently we heard the good news that the University of Hawaii Foundation had raised $330 million in charitable donations over a six-year period. What got less press attention was that the UH faculty had raised over $400 million in grant support, not over six years but in a single year. At the same time we learned that top UH executives, who earn mainly at or above the national average, were taking voluntary pay cuts by up to 10 percent, while lower-level executives would be cut 6 percent to 7 percent. Meanwhile, UH faculty, who (despite some recent raises) still earn well below our colleagues at peer institutions, are being asked to take a 15 percent cut. ...
By many independent measures, UH-Manoa remains one of the great universities in the world. We're one of only 63 public universities in the country with the highest Carnegie Foundation classification. The best-known international ranking of universities ranks us as tied for 59th in the Western Hemisphere.
These rankings are based on the quality of our faculty and programs, not our buildings or athletic records. At this level UHM is in intense competition for the best faculty, grants and students. It is not a coincidence that our successes in recent years, academic and financial, have followed the rebuilding of our faculty, both in size and in salary. We are worried that decisions being made right now by the state and the system will not only undo the recent progress we have made, but cause irreversible harm to our competitive standing. We are already losing faculty, and the cuts will make it difficult to recruit outstanding new faculty members and the research programs that they can develop. Since university rankings are based primarily on a faculty's reputation and grants, our hard-earned status as a research-extensive university could fall into jeopardy."
Sad, but there's an army of adjuncts and underemployed PhDs out there who would work at those reduced salaries or less and who are just as well or better qualified.
From Newsweek
" Paradoxical as it seems, one of the former Soviet states with a more pragmatic approach is Russia itself. To be sure, the Kremlin's fear of a color revolution means that touchy political subjects aren't taught. And there are no U.S.-accredited universities or American academic programs. But Russian authorities have recently begun allowing universities to open up—even if that means greater exposure to outside ideas. Many Russian schools, for example, have started reviving academic exchanges with Western universities. Their motivation is simple: desperation. Last year, not a single Russian university made it into the top 100 of a world ranking put out by Quacquarelli Symonds, a U.S.-based compiler of international university standards. Even Moscow State University, the pride of Russia's education system, slid from 97th place in 2007 to 180th in 2008.
To stop the rot, last year Prime Minister Vladimir Putin founded two new universities, bankrolling them to the tune of $300 million. More important, "education policymakers gave a signal to Russian universities to quickly embrace all the most innovative international programs, and now nothing is stopping them from inviting or hiring as many U.S. professors as they can," says Andrei Volkov, an adviser to the minister of education and rector of Moscow's Skolkovo School of Management. Accordingly, Moscow University recently signed a cooperation deal with the State University of New York to share students and award joint diplomas, and 65 U.S. visiting professors are working in Moscow this year. Another joint agreement with the University of Southern California is due to be inked this fall. "
I wonder if someone should tell Putin that until last year three out of four university centers in the SUNY system were not even listed in the THE-QS rankings.
Sunday, August 02, 2009
This is from TES Connect via Butterflies and Wheels:
"Exams for an Evangelical Christian curriculum in which pupils have been taught that the Loch Ness monster disproves evolution and racial segregation is beneficial have been ruled equivalent to international A-levels by a UK government agency.
The National Recognition Information Centre (Naric), which guides universities and employers on the validity of different qualifications, has judged the International Certificate of Christian Education (ICCE) officially comparable to qualifications offered by the Cambridge International exam board.
Hundreds of teenagers at around 50 private Christian schools in Britain study for the certificates, as well as several home-educated students."
This is the interesting bit:
"Mrs Lewis [spokesperson for the International Certificate of Christian Education] had not noticed the Loch Ness monster claims, which she suggested may have been a “slip at the typewriter”, adding that the science curriculum had helped a student to gain a place to study natural sciences at Oxford University [4th in the world according to THE-QS]."
This is from today's Observer
"Universities were yesterday embroiled in a furious row over dumbing down after a parliamentary inquiry revealed the number of first-class degrees had almost doubled in a decade. Amid the war of words, senior Tories vowed to publish data that they claimed would reveal the true value of degrees.
Different universities demand "different levels of effort" from students to get similar degrees, according to the report from the commons select committee on innovation, universities and skills, suggesting that top grades from some colleges were not worth the same as others. "
And what is the cause of this grade inflation?
"Gillian Evans, a lecturer in mediaeval theology at Oxford University and an expert in university regulation, attributed the rise to universities' desire to move up published league tables.
"I am quite sure the reason proportions have gone up is exactly the same as the reasons A-levels have gone up: it's straightforward grade inflation, chasing a place in league tables," she said."
Wednesday, July 29, 2009
Top USA and Canada
1. MIT
2. Harvard
3. Stanford
4. Berkeley
5. Cornell
Top Europe
1. Cambridge
2. Oxford
3. Federal Institute of Technology ETH Zurich
4. University College London
5. University of Helsinki
Top Oceania
1. Australian National University
2. University of Queensland
3. Monash University
4. University of Melbourne
5. University of Sydney
Top South East Asia
1. National University of Singapore
2. Prince of Songkla University
3. Chulalongkorn University
4. Kasetsart University
5. Mahidol University
Top Arab World
1. King Saud University
2. King Fahd University of Petroleum and Minerals
3. Imam Muhammad bin Saud University
4. King Faisal University
5. King Abdulaziz University
This is from Ranking Web of Universities
"The July edition of the Ranking Web of World Universities (http://www.webometrics.info) shows important news. Most of them are due to changes done to improve the academic impact of the open web contents and to reduce the geographical bias of search engines. As a result, the US universities still lead the Ranking (MIT with its huge Open Courseware is again the first, followed by Harvard, Stanford and Berkeley), but the digital gap with their European counterparts (Cambridge and Oxford are in the region’s top) has been reduced. Even more important, some of the developing countries institutions reach high ranks, especially in Latin America where the University of Sao Paulo (38th) and UNAM (44th) benefits from the increasingly interconnected Brazilian and Mexican academic webspaces. Several countries improves their performance including Taiwan and Saudi Arabia with strong web oriented strategies, Czech Republic (Charles), the leader for Eastern Europe, Spain (Complutense) and Portugal (Minho, Porto) with huge repositories and strong Open Access initiatives. Norway (NTNU, Oslo), Egypt could be also mentioned. On the other side, the underrated are headed by France, with a very fragmented system, Korea, whose student-oriented websites are frequently duplicated, New Zealand, India or Argentina. Africa is still monopolized by South African universities (Cape Town is the first, 405th), as well as Australian Universities are the best ranked for Oceania (Australian National University, 77th). Other well performing institutions include Cornell or Caltech in the USA, Tokyo (24th), Toronto (28th), Hong Kong (91st), or Peking (104th). On the contrary, in positions below expected we find Yale, Princeton, Saint Petersburg, Seoul and the Indian Institutes of Science or Technology."
Tuesday, July 28, 2009
The Payscale site has produced a ranking of American schools and colleges by the salaries that their graduates earn.
Here are the top five engineering colleges by median mid-career salary:
1. MIT
2. Harvey Mudd
3. Stanford
4. Bucknell University
5. Rensselaer Polytechnic Institute
the top five Ivy League schools:
1. Dartmouth College
2. Harvard
3. Princeton
4. Yale
5. University of Pennsylvania
the top five liberal arts colleges:
1. Colgate University
2. Bucknell University
3. Swarthmore College
4. Amherst College
5. Haverford College
and the top five state universities:
1. Berkeley
2. Colorado School of Mines
3. Georgia Institute of Technology
4. New Jersey Institute of Technology
5. University of California at San Diego.
Wednesday, July 22, 2009
In the THE-QS rankings universities get 10 per cent of the weighting for the proportion of international students and international faculty. Does this measure say anything about the quality of a university?
Maybe. But one thing it clearly reflects is the size of the country in which the university is located. Across the top 400 universities in the 2008 rankings, there is a moderate negative correlation of -.332 between the score for international faculty and national population, and one of -.326 between the score for international students and population.
This may help to explain why Hong Kong universities have been doing so well lately compared with those in Mainland China.
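Incidentally, a correlation of this kind is simple to compute. The sketch below shows the calculation with the Pearson coefficient; the figures in it are made up for illustration and are not the actual 2008 ranking data.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical figures, not the published 2008 data: international
# faculty scores for five universities against the populations
# (in millions) of their home countries.
intl_faculty_score = [95, 88, 40, 25, 15]
population_millions = [5, 7, 60, 127, 300]

print(round(pearson(intl_faculty_score, population_millions), 3))
```

A negative result here would simply reflect the pattern described above: universities in small countries tend to score higher on the international indicators.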
Sunday, July 19, 2009
L'Ecole des Mines de Paris has produced its third Professional Ranking of World Universities. This is based solely on the number of alumni among the CEOs of Fortune's top 500 companies. The top five, in order, are Tokyo, Harvard, Stanford, Waseda and Seoul National universities. Five French schools are in the top 20 and in general France performs much better on these rankings than on any other, which, one suspects, might be the whole point of the enterprise.
According to University World News
Interviewed in the online higher education publication Educpros, Nicolas Cheimanoff, director of studies at Mines ParisTech, explained the aims of the rankings: "In France we were challenged into taking action, to say we could not base arguments exclusively on the Shanghai ranking and construct higher education policy solely on this ranking. We wanted to show at an international level that France is a country where you can study. Our ranking gives visibility to a school, but also to the system of French higher education as a whole."
Cheimanoff said Mines ParisTech had been in contact with Professor Liu, originator of the Shanghai rankings, to suggest Jiao Tong should incorporate the Mines criterion. "He was a priori in favour but only if we included the academic careers of company heads since 1920 as he did for the Nobel prizewinners. But that's totally impossible."
The Paris rankings do correlate quite well with others, indicating that they are measuring some sort of quality. However, the performance of French, Japanese and Korean schools may say more about the recruitment and immigration policies of their countries than anything else.
Also, one wonders whether producing the CEO of General Motors is indicative of the real quality of Duke and Harvard.
The frightening thing is that it probably is.
Monday, July 06, 2009
Another national ranking system is on the way.
The Commission on Higher Education (Ched) will come up with a ranking system of the best schools in specific fields of study or discipline, an official said today.
"We may come up (with the ranking system) within the year," said Ched executive director Julito Vitriolo in a phone interview. As of the moment, Vitriolo said, Ched is compiling the licensure examination results on different fields of study in various colleges and universities nationwide.
See here for more.
Presentations from the fourth International Rankings Expert Group Conference in Astana, Kazakhstan, are available here.
Tuesday, June 30, 2009
Although the Shanghai rankings show a high correlation with other rankings (based on a tiny sample of US universities), the HEEACT rankings from Taiwan (Performance Ranking of Scientific Papers for World Universities) do somewhat better. The correlations are .740 with the THE-QS, .984 with the Shanghai ARWU, .711 with the USNWR America's Best Colleges, .920 with the Professional Ranking of World Universities and .700 with the Center for College Affordability and Productivity ranking.
All these rankings measure different things. The USNWR measures a variety of indicators related directly or indirectly to the quality of instruction; the CCAP is quite definitely a consumer-orientated ranking; the THE-QS World University Rankings are largely a measure of research performance (reputational survey, citations per faculty and student-faculty ratio, where researchers are counted in the faculty); the Professional Ranking of World Universities counts CEOs of top companies; while the Shanghai and Taiwan rankings focus entirely on research, mainly in the natural sciences.
The ability of the Taiwan rankings to predict scores on the other rankings suggests that underlying various measures of university quality is a single q factor, the average intelligence of a university's faculty. If there is one single number that would tell you about the general quality of a school then it would probably be the average IQ of the faculty, although performance on standardised tests, publications and citations (especially in the hard sciences) and postgraduate degrees might be good proxies. The strength of the Taiwan rankings would be their focus on research productivity alone.
Incidentally, if anyone from HEEACT reads this, please think of a new name for your rankings. PROSPWU is not exactly a memorable acronym.
Sunday, June 28, 2009
University rankings are popping up everywhere. So how do they compare with one another? One way is to check the correlation between the total scores of the rankings. Here, correlations have been calculated for the scores of ten US universities (every tenth university in the Shanghai rankings excluding those not in the THE-QS top 400).
It seems that the Shanghai ARWU is the most valid of the five rankings. Correlations for total scores are .796 with the THE-QS, .712 with the USNWR America's Best Colleges, .896 with the Professional Ranking of World Universities (Paris) and .628 with the Center for College Affordability and Productivity ranking.
On the basis of this extremely small and unrepresentative sample, it looks as though, if you had to pick just one ranking to rely on, it would have to be the Shanghai ARWU.
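The comparison described above amounts to computing pairwise correlations between the rankings' total scores for the same sample of universities. A minimal sketch follows; the score lists are hypothetical stand-ins for the ten-university sample, not the real published scores.

```python
from itertools import combinations
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical total scores for the same ten universities under
# three rankings (made-up numbers, for illustration only).
rankings = {
    "ARWU":   [100, 73, 71, 70, 69, 54, 42, 40, 38, 35],
    "THE-QS": [96, 99, 86, 93, 85, 60, 58, 51, 48, 40],
    "USNWR":  [100, 98, 91, 92, 80, 77, 70, 65, 55, 52],
}

# Every pair of rankings gets one correlation coefficient.
for a, b in combinations(rankings, 2):
    r = pearson(rankings[a], rankings[b])
    print(f"{a} vs {b}: r = {r:.3f}")
```

One caveat the post itself acknowledges: with only ten universities, a single outlier can move these coefficients a long way, so the numbers should be read as suggestive rather than conclusive.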
Thursday, June 25, 2009
I have just returned from the International Rankings Expert Group’s fourth conference in Astana, Kazakhstan. There were some positive developments at the conference but also a few disappointments.
Starting with the negative aspects, there seems to be a global trend towards the proliferation of national rankings which are increasingly and unnecessarily detailed and which impose a serious burden on teachers and researchers. A case in point is the new ranking produced for Kazakhstan, which includes just about every variable imaginable, from "the number of Dissertation Councils" to "the availability of medical centers, sport halls, preventoriums, recreation zones". Very few at the conference seemed aware of the backwash effect of the rankings boom as universities outside the top 500 create their own rankings or compete for irrelevant awards, medals or certificates. Drudges in the periphery of the world university system now face an endless round of form filling, office tidying, meetings, committees and professional development activities which make teaching difficult and genuine research, as opposed to research-like behaviour, close to impossible.
The European Union ranking project was presented in some detail, but I suspect it is going to make little impact since it appears largely concerned with making fine distinctions between the research capabilities of faculties and departments.
There was a presentation about the Lisbon project, which proposes to ignore research altogether and measure teaching excellence. This is an interesting idea but it seems to miss two important points. One reason for emphasizing the measurement of research is that the qualities required for research (general cognitive ability, reading and writing skills, conscientiousness and interest in a subject) also correlate to some extent with teaching ability, however that is measured. Also, the assumption that learning is dependent on teaching, which in turn must be regulated by a centralized bureaucracy, is surely false, at least for the more able students.
Positive developments include a trend towards personalized rankings where consumers assign their own weighting to indicators. There is an interesting project under way in Taiwan.
Richard Vedder introduced a ranking that has the merit of being based largely on publicly accessible data. The basic idea is excellent but there are some issues to be dealt with. Using RateMyProfessors is not a bad way to assess the quality of teaching but to be really valid there needs to be some adjustment for the grades awarded by the instructor. Using the American Who’s Who is also potentially interesting – and could well be applied internationally -- but there are of course obvious issues of bias.
He also gave a presentation without using PowerPoint. I must remember that next time I fill in a form about my innovative teaching methodology.
One proposal presented was the creation of an IREG seal of approval. The logo is ready. I am not sure, though, whether this is going to be effective.
Overall, the conference has strengthened my conviction that if ranking is to be done it should not be by administrators or businesses but by universities themselves.
Monday, June 22, 2009
Recently, the US News and World Report expanded its rankings portfolio to include the World’s 100 Best Universities. This turned out to be nothing more than the THE-QS World University Rankings with a rebranding for the US market. Now the USNWR has gone a step further and produced a list of the world’s top 400 universities along with sundry regional and subject rankings. Once again, this is the QS rankings with a new name.
This is no doubt a shrewd move for QS who are now marketing their rankings on both sides of the Atlantic and appear to be on the way to establishing a near monopoly over the international ranking business. It could, however, be risky for USNWR. People are bound to wonder why it should link up with a company that has a history of errors where American universities are concerned. In 2007 QS got their North Carolina business schools mixed up and as a result caused Fortune magazine to withdraw its business school rankings based on QS data. Will US students and stakeholders forgive the USNWR if its data includes things like a near zero for research for Washington University in St Louis or an unbelievably good score for Duke for student faculty ratio?
Tuesday, June 16, 2009
This is from GLOBALHIGHERED.
Finally the decision on who has won the European Commission’s million euro tender – to develop and test a global ranking of universities – has been announced.
The successful bid, the CHERPA network (or Consortium for Higher Education and Research Performance Assessment), is charged with developing a ranking system to overcome what are regarded by the European Commission as the limitations of the Shanghai Jiao Tong and QS-Times Higher Education schemes. The final product is to be launched in 2011.
CHERPA is comprised of a consortium of leading institutions in the field within Europe; all have been developing and offering rather different approaches to ranking over the past few years (see our earlier stories here, here and here for some of the potential contenders):
- CHE – Centre for Higher Education Development (Gütersloh, Germany)
- Center for Higher Education Policy Studies (CHEPS) at the University of Twente (Netherlands)
- Centre for Science and Technology Studies (CWTS) at Leiden University (Netherlands)
- Research division INCENTIM at the Catholic University of Leuven (Belgium)
- Observatoire des Sciences et des Techniques (OST) in Paris
- European Federation of National Engineering Associations (FEANI)
- European Foundation for Management Development (EFMD)
Monday, June 15, 2009
One of the most dangerous things about university rankings is that they are becoming -- in parts of Asia at any rate -- symbols of national grandeur or decline, attracting almost as much public concern and interest as the World Cup.
Dr Hsu has an interesting post on the divergent histories of Singapore and Malaysia that contains this comment:
Incidentally, I think this university ranking [almost certainly he means THE-QS] can be taken as representative of everything comparative among the 2 countries.
Sunday, June 14, 2009
Publish and Pay
There is a growing trend towards open access academic publishing where researchers have to pay for publication. Open access is in principle a good idea but the idea of authors rather than subscribers footing the bill has its dangers.
Firstly, it poses a threat to new academic journals in emerging countries. There are, I suspect, quite a few researchers who would find it more convenient to spend a few hundred dollars, especially if it comes out of grant money, on speedy and "prestigious" international publication rather than writing for a local journal with limited impact.
Secondly, there is a definite threat to standards if criteria for publication are too relaxed or perhaps even abandoned altogether.
Recently, Philip Davis and Kent Anderson sent a totally nonsensical computer-generated paper to the Open Information Science Journal. It was accepted, supposedly after peer review, with a request for the payment of $800 in author's fees. In this case, at least, the peer review process had apparently been dropped altogether.
For more information see The Scientist and the "authors'" blog, The Scholarly Kitchen.
In all fairness, it must be pointed out that another computer-generated paper submitted to another journal run by the same company was rejected, and that at least one reviewer figured out what was going on.
Still, this does have disturbing implications. If publication becomes influenced or even determined by ability to pay then we are heading for the complete corruption of the peer review system.
It would be a good idea if universities refused to consider articles in pay-for-publication journals as evidence for selection or promotion. Perhaps, also, Scopus and other databases could list such journals in a separate category.
Anyway, here is an extract from the first paper:
"In this section, we discuss existing research into red-black trees, vacuum tubes, and courseware [10]. On a similar note, recent work by Takahashi suggests a methodology for providing robust modalities, but does not offer an implementation [9]."
Thursday, June 11, 2009
University rankings are popping up everywhere now. It is time to start comparing them with each other. First, here are the numbers of results from a Yahoo! search using the official names of the rankings. In the lead is the THE-QS World University Rankings, followed by the USNWR America's Best Colleges. The Shanghai rankings have made a much smaller impact, and the Webometrics rankings even less. No doubt a search in languages other than English would lead to different results, as would a search using different names.
Still, it seems that in the webosphere THE-QS have a strong lead among the international rankings.
"World University Rankings" (THES-QS) 942,000
"America's Best Colleges" (US News and World Report) 892,000
"Times Good University Guide" (UK) 252,000
"Academic Ranking of World Universities" (Shanghai) 123,000
"Guardian Good University Guide" (UK) 87,300
"Maclean's University Rankings" (Canada) 14,200
"World University Ranking on the Web" (Webometrics) 9,890
"CHE/Die Zeit University Ranking" (Germany) 4,570
"Ranking of Scientific Papers for World Universities" (Taiwan) 2,120
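For what it's worth, the relative weights of these figures can be seen at a glance by tabulating them. Here is a minimal Python sketch that does so, using only the counts quoted above (the figures themselves are simply the Yahoo! results listed in this post, not a fresh search):

```python
# Yahoo! result counts quoted above, one entry per ranking name.
counts = {
    "World University Rankings (THES-QS)": 942_000,
    "America's Best Colleges (US News and World Report)": 892_000,
    "Times Good University Guide (UK)": 252_000,
    "Academic Ranking of World Universities (Shanghai)": 123_000,
    "Guardian Good University Guide (UK)": 87_300,
    "Maclean's University Rankings (Canada)": 14_200,
    "World University Ranking on the Web (Webometrics)": 9_890,
    "CHE/Die Zeit University Ranking (Germany)": 4_570,
    "Ranking of Scientific Papers for World Universities (Taiwan)": 2_120,
}

# Sort by count, descending, and show each ranking's share of the total.
total = sum(counts.values())
for name, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {n:,} ({n / total:.1%})")
```

Run this way, the top two rankings between them account for well over three quarters of all the hits.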
Sunday, June 07, 2009
According to the QS.com ranking of Asian universities, the best university in Asia for Student/Faculty Ratio is "College of Medicine, Pochon Cha University". (It seems that it is actually Pochon CHA, CHA being the name of a private medical conglomerate.)
This is a little odd, since the institution is clearly a single-subject one and therefore presumably should not have been included in the rankings at all. This was the rationale for the University of California at San Francisco being removed after a brief appearance in the world rankings.
It is possible, though, that QS has different requirements for inclusion in the world and the regional rankings. If so, countries now have a new strategy for getting excellent scores in the rankings: just designate medical schools or faculties as independent universities. They will get good scores for publications and citations, since medical researchers tend to publish short articles that are cited more frequently and more quickly than those in other disciplines, and for student/faculty ratio, since they have many clinical faculty who can be added to the faculty totals.
It will be interesting to see how long Pochon CHA University remains in the Asian rankings or whether it will appear in the forthcoming world rankings.