Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, November 05, 2009
The latest edition of the Academic Ranking of World Universities published by Shanghai Jiaotong University contains few changes at the top. In the top 20 the only change is that Johns Hopkins and Tokyo swap the 19th and 20th places.
Further down is another matter.
I have counted six institutions that have fallen out of the top 500. They are:
University of Akron
University of Idaho
University of Tennessee Health Center
Medical College of Georgia
University of Maine at Orono
Mississippi State University
Sad news about Idaho, alma mater of Sarah Palin. No doubt this will be further ammunition for those who want to crow about the intellectual superiority of Joe Biden.
The American universities have been replaced by:
Université Victor Segalen Bordeaux 2, France
Swinburne University of Technology, Australia
Pompeu Fabra University, Spain
University of Santiago de Compostela, Spain
King Saud University, Saudi Arabia
University of Tehran, Iran
Kyungpook National University, Korea
The trend is clear. The US, except perhaps for the West coast, is declining. The Mediterranean, Southwest Asia and the Pacific Rim are rising.
The recent conference in Shanghai highlighted the rise of King Saud University, largely accomplished by the recruitment of highly cited researchers (much the same strategy that underlay the dramatic ascent of the Hong Kong University of Science and Technology), and of the University of Tehran, which showed a massive improvement in its number of publications in 2008.
Saturday, October 31, 2009
From Times Higher Education
"Times Higher Education has signed an agreement with Thomson Reuters, the world’s leading research-data specialist, to provide the data for its annual World University Rankings.
The magazine will develop a new rankings methodology in the coming months, in consultation with its readers, its editorial board of higher education experts and Thomson Reuters. Thomson Reuters will collect and analyse the data used to produce the rankings on behalf of Times Higher Education."
.........
"QS, which has collected and analysed the rankings data for the past six years, will no longer have any involvement with Times Higher Education’s World University Rankings."
Friday, October 23, 2009
International university rankings have been around long enough to show signs of long-term trends. Making sense of the THE-QS rankings is, however, complicated by frequent changes of methodology and occasional errors. The Shanghai rankings seem to be another matter. There has been only one significant change in method, in 2004, when articles in Nature and Science were counted over five years rather than three. It should be possible, then, to determine some general trends in research performance between 2004 and 2008.
These rankings do not indicate the exact position of universities but place them within broad bands. This is understandable but rather pointless since positions can be calculated from the component indicators in less than half an hour.
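To illustrate how quickly exact positions can be recovered from the component scores, here is a minimal sketch of the weighted sum. The weights follow the stated ARWU methodology; the example indicator scores are invented, not taken from any real university's entry.

```python
# Reassembling an ARWU-style overall score from published indicator scores.
# Weights as stated in the ARWU methodology; example scores are hypothetical.
ARWU_WEIGHTS = {
    "alumni": 0.10,   # alumni winning Nobel Prizes and Fields Medals
    "award": 0.20,    # staff winning Nobel Prizes and Fields Medals
    "hici": 0.20,     # highly cited researchers
    "ns": 0.20,       # papers published in Nature and Science
    "pub": 0.20,      # papers indexed in SCIE and SSCI
    "pcp": 0.10,      # per capita academic performance
}

def overall_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each on a 0-100 scale."""
    return sum(ARWU_WEIGHTS[k] * v for k, v in indicators.items())

example = {"alumni": 40.0, "award": 30.0, "hici": 50.0,
           "ns": 45.0, "pub": 60.0, "pcp": 35.0}
print(round(overall_score(example), 1))  # 44.5
```

Sorting all 500 institutions by this score would give the exact positions that the published bands obscure.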
If we compare the positions of various universities, some interesting changes begin to emerge.
Between 2004 and 2008 Chinese universities advanced steadily: Peking from 296th to 241st, Tsinghua from 213th to 203rd, Nanjing from 330th to 292nd, the University of Science and Technology of China from 333rd to 243rd, Zhejiang from 350th to 226th, Fudan from 372nd to 325th and Jilin from 478th to 430th.
Shanghai Jiao Tong University itself rose from 461st to 257th.
In addition, seven Chinese universities entered the rankings between 2004 and 2008.
Taiwanese universities also rose: National Taiwan University from 174th to 164th, National Tsing Hua University from 362nd to 309th and National Cheng Kung University from 408th to 305th.
The picture for Hong Kong universities is mixed. The University of Hong Kong, the Chinese University of Hong Kong and City University of Hong Kong went up but the Hong Kong University of Science and Technology and the Hong Kong Polytechnic University went down.
In a little while we shall see whether these trends continue.
Thursday, October 15, 2009
"Canadian universities among top 200 in the world should be supported to keep them there" from The Vancouver Sun
"To the unbiased observer, the THE-QS rankings appear to be designed to put colony-dominated UK institutions at the top, for what appear to be biased business-related reasons, and indeed, THE-QS puts 4 of the top 6 universities in the world in the United Kingdom. How convenient for the home field advantage! but scarcely science, and scarcely reliable."
from Law Pundit.
"UC Irvine’s status takes a hit in new ranking of the world’s top colleges and universities" from Orange County Register
"King Saud University, King Fahd University of Petroleum and Minerals Listed among World's Top" from Saudi Gazette
"As Asian neighbours gain academic clout, the Kingdom must establish clear targets for itself". John O'Leary in Phnom Penh Post
"Canberra still Home of Australia's Best Higher Education, ANU" NOWUC
Tuesday, October 13, 2009
According to university administrators, universities rise in the THE-QS rankings because of enlightened leadership, quality control exercises like key performance indicators, ISO compliance, professional development and so on, increasing the quantity and impact of research and internationalisation. When they fall it is, according to administrators, because of the manifest bias of the rankings or, according to disgruntled outsiders, because of administrative deficiencies.
In this year's rankings, there have been quite a few substantial changes in both directions between 2008 and 2009. Here are some of the fortunate cases who experienced an improvement and some comments on what actually contributed to the changes.
University College London
Rose from 7th place (total score 98.1) to 4th (99), just behind Yale, largely because of an improvement of 2 points on the academic survey, which has a weighting of 40%.
Princeton
Rose from 12th (95.7) to 8th (96.6) mainly because of an improvement in the student faculty ratio from 75 to 82 despite falling on 3 other indicators.
University of Toronto
A big improvement from 41st (81.1) to 29th (85.3) largely due to a whopping improvement in the faculty student ratio from 18 to 63, counteracting a fall for citations per faculty from 100 to 74.
University of Alberta
Rose from 74th (72.9) to 59th (75.4). This was almost entirely because of an improvement on the recruiter review from 48 to 71 points.
University of Oslo
A spectacular ascent from 177th (57.5) to 101st (62.9) in which strong gains on the academic survey, recruiter review and faculty student ratio (a combined weighting of 70%) outweighed losses for citations per faculty and internationalisation (weighting of 30%).
Pohang University of Science and Technology
Rose 50 places from 184th to 134th propelled by an improvement in the academic survey from 37 to 53 points.
Keio University
Another remarkable rise from 214th (53.0) to 142nd (61.6) resulting from an improvement of 6 points on the academic survey and 38 for faculty student ratio, tempered by a 9 point fall for citations per faculty.
Chulalongkorn University
An improvement of 28 places caused mainly by a rise of 10 points on the academic survey.
Yonsei University
Rose from 203rd (54.1) to 151st (60.3). An improvement of 20 points for the academic survey more than compensated for declines in 4 other indicators.
In general then, ascent and descent within these rankings depends to a very large extent on the academic survey and faculty student ratio followed by the recruiter review. Changes in citations per faculty and international faculty and students have little impact, at least in the short run.
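The arithmetic behind this pattern is easy to check. A small sketch, using the published THE-QS indicator weightings (academic survey 40%, recruiter review 10%, faculty student ratio 20%, citations per faculty 20%, international faculty and students 5% each), shows how a change on any one indicator feeds into the weighted total; the deltas passed in are illustrative, not official figures.

```python
# Contribution of per-indicator score changes to the change in the
# weighted total, using the published THE-QS weightings.
THE_QS_WEIGHTS = {
    "academic_survey": 0.40,
    "recruiter_review": 0.10,
    "faculty_student": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def total_change(deltas: dict) -> float:
    """Change in the weighted total implied by per-indicator changes."""
    return sum(THE_QS_WEIGHTS[k] * d for k, d in deltas.items())

# A 2-point rise on the 40%-weighted academic survey alone moves the
# total by 0.8 points -- enough to overturn small gaps near the top,
# while a 2-point rise on international students moves it by only 0.1.
print(total_change({"academic_survey": 2.0}))       # 0.8
print(total_change({"international_students": 2.0}))  # 0.1
```

This is why the survey and faculty student ratio dominate short-run movement: their weights are simply much larger.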
Monday, October 12, 2009
I have hesitated about putting up this post since the missing rankings might be restored in a few days or even hours.
I am sure that many people have noticed that the pre-2009 THE-QS rankings can no longer be accessed at the topuniversities site and that the list of the top 400 universities now there only includes the total scores, not those for the indicators. The Times Higher Education site does have data on the indicators for 2009 and preceding years but only for the top 200 universities in each case.
This unfortunately means that it is impossible for readers to check the reasons for the rise or fall of universities between 2008 and 2009.
All is not lost. I have saved the page for 2008 that indicates the indicator scores for the top 400. Anyone interested in knowing the scores for a particular university in that year can just send a note via the comments sections.
Sunday, October 11, 2009
America Slips in Rankings of World Universities
Brown University Falls
The University of Southampton is Now Even Better
NTU's Ranking Rises
Thursday, October 08, 2009
A standard feature of the annual release of the THE-QS rankings is the chorus of derision that greets the fall of a Malaysian university or of congratulation for a rise.
This year Universiti Malaya (UM) rose 50 places from 230 to 180.
According to the Vice-Chancellor, it was because:
"The redefinition of key performance indicators for the academics and the new initiatives implemented in international networking, recruitment of international staff and students have produced a quick, positive impact,” he said."
According to Ben Sowter, QS's head of research,
"UM’s resurgence into the top 200 was clearly impressive.
“The apparent collective effort at the university to attract a greater proportion of international students suggests a progressive outlook,” he said in an e-mail interview."
But was this what actually happened?
First, between 2008 and 2009 UM dropped quite a bit on the academic survey from 64 out of 100 to 60 (top score is 100 and the mean is 50) and on the recruiter survey from 70 to 68. This may have been the result of a subtle change in the surveys that required respondents to type in the name of selected universities rather than clicking and dragging from a list. This could have worked to the disadvantage of less well known universities.
The fall on the surveys was almost exactly balanced by a rise in the score for international students from 46 to 65 and for international faculty from 63 to 72. The effect of the change in these indicators was reduced by the low weighting that they receive.
There was also a slight improvement in citations per faculty from 20 to 21.
UM had an overall improvement from 50.8 to 56.5. This was almost entirely the result of a massive improvement in the faculty student ratio indicator, from 38 to 68, worth a full 6 points on the weighted total.
In 2008 UM, according to QS's Top Universities Guide, had a ratio of 14.8 students per faculty.
According to the QS topuniversities web site, the ratio has fallen to 8.9 this year. This appears to have been achieved by reducing the number of students by about 6,000 and by increasing the number of faculty by about 600.
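A quick back-of-envelope check shows that the two reported ratios are consistent with headcount changes of roughly that size. The actual student and faculty counts are not published here, so the sketch below solves for the counts that would satisfy both ratios; the result is purely illustrative, not official data.

```python
# Check that a ratio of 14.8 falling to 8.9 is consistent with roughly
# 6,000 fewer students and 600 more faculty. Counts are unknown, so we
# solve for them from the two ratio equations.
r_2008 = 14.8        # students per faculty member, 2008
r_2009 = 8.9         # students per faculty member, 2009
d_students = -6000   # reported change in student numbers
d_faculty = 600      # reported change in faculty numbers

# With s = r_2008 * f and (s + d_students) = r_2009 * (f + d_faculty):
#   r_2008 * f + d_students = r_2009 * f + r_2009 * d_faculty
f = (r_2009 * d_faculty - d_students) / (r_2008 - r_2009)
s = r_2008 * f
print(round(f), round(s))  # roughly 1,900 faculty and 28,000 students
```

Implied counts of that order are plausible for a university of UM's size, which is consistent with the blog's reading of how the ratio was achieved.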
Performance indicators may get UM into the Shanghai or Taiwan rankings but they were not relevant in this particular case.
This article by Kris Olds is worth reading. A couple of extracts:
"It seems as if the Times Higher has decided to allocate most of its efforts to promoting the creation and propagation of this global ranking scheme in contrast to providing detailed, analytical, and critical coverage of issues in the UK, let alone in the European Higher Education Area. Six steady years of rankings generate attention, advertising revenue, and enhance some aspects of power and perceived esteem. But, in the end, where is the Times Higher in analyzing the forces shaping the systems in which all of these universities are embedded, or the complex forces shaping university development strategies? Rather, we primarily seem to get increasingly thin articles, based on relatively limited original research, heaps of advertising (especially jobs), and now regular build-ups to the annual rankings frenzy. In addition, their partnership with QS Quacquarelli Symonds is leading to new regional rankings; a clear form of market-making at a new unexploited geographic scale. Of course there are some useful insights generated by rankings, but the rankings attention is arguably making the Times Higher lazier and dare I say, irresponsible, given the increasing significance of higher education to modern societies and economies."
.....
"The discourse of “international” is elevated here, much like it was in the last Research Assessment Exercise (RAE) in the UK, with “international” codeword for higher quality. But international is just that – international – and it means nothing more than that unless we assess how good they (international students and faculty) are, what they contribute to the educational experience, and what lasting impacts they generate."
Sunday, October 04, 2009
Only four days to go before the publication of the 2009 THE-QS World University Rankings.
The rankings will be published here and here.
Here is a trailer from THE
Will anyone be able to topple Harvard from the top spot? Has Cambridge still got the edge over Oxford? Can any nation break through the UK-US dominance of the top 10?
The first and third events seem extremely unlikely without some unannounced change in methodology although by most objective indicators the University of Tokyo ought to have a good chance. So I suspect that THE is hinting that Oxford will overtake Cambridge.
Saturday, August 29, 2009
The list of the world's most powerful computers includes a number operated by universities. Among those in the top 100 are:
University of Tennessee
King Abdullah University of Science and Technology, Saudi Arabia
University of Toronto
University of Tokyo
University of Tsukuba, Japan
University of Minnesota
University of Edinburgh
University of Southern California
Kyoto University
Moscow State University
Umea University, Sweden
Clemson University, USA
University of Bergen, Norway
Sunday, August 23, 2009
A recent post raised questions about what it takes to be an education professor in the US. However, a recent exchange in The Chronicle of Higher Education makes one wonder whether faculties of law are much better.
Nancy Lemon teaches law at the University of California at Berkeley and is the author of a well-known textbook on domestic violence. She has been taken to task by Christina Hoff Sommers for including errors in the textbook.
Lemon’s attempted rebuttal is interesting. Sommers takes issue with her claim that a very large proportion of women admitted to hospital emergency rooms were victims of domestic violence. Lemon’s response is to cite figures showing that a large proportion of the women admitted because of violence were victims of domestic violence, apparently not realising that she is moving the argumentative goalposts quite a lot. Lemon also insists that March of Dimes, a well known charitable organization, had sponsored a study showing that battered women were more likely to have miscarriages, when in fact the organisation’s involvement was peripheral.
Lemon’s worst error was her solemn claim that the traditional chroniclers of Rome were totally accurate and that Romulus, when not busy being suckled by a wolf and watching birds, had promulgated a misogynist law allowing men to beat their wives, which "continued into England". It is bad enough that Lemon is totally credulous about traditional historians, but that she has apparently never heard of the Anglo-Saxon conquest, which removed Roman law from England and prevented any transmission into common law, is remarkable.
Saturday, August 22, 2009
Outcomes-based Education (OBE) is sweeping across Australia, the UK, Malaysia and other countries. I am sure that OBE is not the whole story, but if this news from South Africa is any guide the results are likely to be very negative.
SOUTH AFRICA: Shocking results from university tests
Karen MacGregor, 16 August 2009, Issue: 0035
South African vice-chancellors warned the government last week to expect more students to drop out, as the shocking results of pilot national benchmark tests revealed that only 7% of first-year students are proficient in mathematics, only a quarter are fully quantitatively literate and fewer than half have the academic literacy skills needed to succeed without support.

"The challenge faced by higher education institutions in relation to mathematics is clearly enormous," according to a draft report produced for the vice-chancellors' association Higher Education South Africa (HESA) by the National Benchmark Tests Project. "With the current emphasis on the production of graduates in scarce skills areas such as engineering and science, the need for curriculum responsiveness and remediation in this area is urgent," said the report, obtained by University World News, which is still to be considered by HESA.

Last week HESA chairman, Professor Theuns Eloff, told parliament's higher education committee that most first-year students could not adequately read, write or comprehend - and universities that conduct regular competency tests have reported a decline in standards. While undergraduate enrolments had been growing by about 5% a year, and black students now comprised 63% of enrolment, there was concern about high drop out (around 50%) and low graduation rates, especially among black students. Only a third of students obtain their degrees within five years.

HESA's findings from the benchmark project make it clear that South Africa's school system is continuing to fail its pupils and the country, and that universities will need to do a lot more to tackle what appear to be growing proficiency gaps. One reason for declining educational performance, Eloff argued, was flaws in the country's outcomes based education system. "You don't learn to spell and comprehend, and that's nonsense," he said.
The Times newspaper commented: "So far, the only outcome from the outcomes-based education system is university students who can't read and write."
This is the title of an article by Michael Sauder and Wendy Nelson Espeland in the American Sociological Review. The Foucauldian jargon is not to my taste but underneath it there is a sensible, data-backed article on the pervasive and negative effects of rankings on US law schools.
Here is the abstract:
"The Discipline of Rankings: Tight Coupling and Organizational Change
Michael Sauder (University of Iowa) and Wendy Nelson Espeland (Northwestern University)

This article demonstrates the value of Foucault’s conception of discipline for understanding organizational responses to rankings. Using a case study of law schools, we explain why rankings have permeated law schools so extensively and why these organizations have been unable to buffer these institutional pressures. Foucault’s depiction of two important processes, surveillance and normalization, show how rankings change perceptions of legal education through both coercive and seductive means. This approach advances organizational theory by highlighting conditions that affect the prevalence and effectiveness of buffering. Decoupling is not determined solely by the external enforcement of institutional pressures or the capacity of organizational actors to buffer or hide some activities. Members’ tendency to internalize these pressures, to become self-disciplining, is also salient. Internalization is fostered by the anxiety that rankings produce, by their allure for the administrators who try to manipulate them, and by the resistance they provoke. Rankings are just one example of the public measures of performance that are becoming increasingly influential in many institutional environments, and understanding how organizations respond to these measures is a crucial task for scholars."
Thursday, August 20, 2009
The nomination of Sonia Sotomayor to the Supreme Court of the United States has focused attention on two cases that came before her, both involving people afflicted with dyslexia.
The better known of these is that of Frank Ricci, a New Haven, Connecticut fireman denied promotion because not enough members of racial minorities were able to pass the firefighters’ test along with him.
The other involved Marilyn Bartlett who wanted to be a lawyer. Before attempting to switch professions she had a notably successful academic career, earning her first degree in Education from the State College at Worcester, Massachusetts, her master’s in Special Education from Boston University (84th in the 2008 Shanghai rankings) and a Ph.D. in Educational Administration from New York University (32nd). She has taught English in Germany and has been an Associate Professor of Educational Leadership at the University of South Florida (201-302) and Director of Graduate Studies as well as Chair and Associate Professor in the Department of Educational Leadership and Technology at the New York Institute of Technology. She is now Dean of the School of Education at Texas A & M University - Kingsville.
She has also been a law clerk, an assistant superintendent of schools and a special education coordinator.
We are further told that:
“She has seven articles in progress and has published numerous articles in encyclopedias, proceedings, periodicals, book chapters and reports. Her first book – an examination of education law in Florida – is due to be published this fall.
In 2006, she received the Teaching Excellence Award from the College of Education at the University of Florida St. Petersburg and in 1999, Bartlett received the Lifetime Achievement Award presented by LD Access, a foundation focused on needs of learning disabled adults.
Bartlett is a member of the American Education Research Association, the American Association for School Administrators, the Educational Law Association and the National Council of Professors of Educational Administrators.”
However, three students on ratemyprofessors speak of her in less than glowing terms: “not very student friendly”, “horrible and incompetent” and "incompetent and not up to date on the educational needs".
After taking a degree in law at Vermont Law School, Bartlett took the New York bar exam four times and failed on each occasion. She then requested special accommodation because of a claimed disability, dyslexia. She was allowed to use a computer, to have an assistant to read the answers and to have 50 per cent extra time. However, she still failed.
The case then came before Sotomayor and evidence was presented that
“The effect of plaintiff's reading impairment on her life, even with all of her self-accommodations, is profound. Cf. 29 C.F.R. Pt. 1630, App. A § 1630.2(j) ("The determination of whether an individual has a disability is . . . based on . . . the effect of that impairment on the life of the individual."). Plaintiff has difficulty with tasks that most people perform effortlessly, including reading short e-mails, using a telephone directory or electronic database, writing a shopping list, or following a recipe. (Bartlett Aff. PP 11, 12, 13, 22.) Plaintiff generally avoids reading any unnecessary material and does not read for pleasure. (Bartlett Aff. PP9, 10, 14.) As plaintiff and her experts stated, plaintiff consistently tries to find alternative routes around reading. Dr. Hagin [*119] testified that based on her experience, plaintiff's "reading was more limited than the average person I might see, even the average person with a learning disability." (Tr. at. 163.) “
However, even with the accommodations mandated by Judge Sotomayor, Bartlett could not pass the bar exam and apparently has now stopped trying.
I am surely not alone in wondering about the common sense involved in requiring somebody to demonstrate serious incompetence in a key professional skill so that they may be assisted to gain entrance to a profession. Perhaps a lawyer can explain why people should not be allowed to show extreme cowardice or pyrophobia to be fast tracked into a fire department or serious myopia to become an airline pilot.
What I am concerned with here is what the case says about basic academic standards at American schools of education.
I assume that Bartlett really is dyslexic, although faking is apparently not impossible, indeed not uncommon, and that the accommodations granted during graduate school and after were not a substitute for intelligence but devices necessary to allow it to function. (G. H. Harrison, M. J. Edwards and K. C. H. Parker, "Identifying students feigning dyslexia: Preliminary findings and strategies for detection", Dyslexia 14/3, 228-246)
We still have to explain how someone who cannot acquire the basic knowledge or skills needed to enter the legal profession can not only complete a doctoral degree, and therefore be certified as an authority on education, but go on to become a recognized academic leader.
The only answer I can think of is that the minimum intellectual ability required to start a legal career is very much higher than that needed to rise to the top in education.
Rankings Frenzy 09 in Inside Higher Ed
Elyse Ashburn in the Chronicle of Higher Education
Bob Morse on Making Sure the College Data Are Correct in Morse Code
The US News and World Report's Best Colleges 2010 is out. The top ten national universities are:
1 = Harvard and Princeton
3. Yale
4 = Caltech, MIT, Stanford, University of Pennsylvania
8 = Columbia, Chicago
10. Duke
Monday, August 17, 2009
The CCAP (Center for College Affordability and Productivity)/Forbes rankings are rather different from the rest, being emphatically based on outcomes rather than spending.
One quarter of the weighting of these rankings is for student satisfaction, based on scores from the ratemyprofessors site; another quarter for graduate success, derived from Who's Who in America and payscale.com; a quarter for current students' success -- graduation rates and winners of national student awards; a fifth for the debt incurred by students; and five per cent for faculty quality.
Richard Vedder, the director of CCAP, claims that the rankings are relatively difficult to manipulate. Up to a point this is true. I cannot see much that anyone could do about Who's Who. But if these rankings ever overtook the USNWR rankings there could well be a lot of fiddling with graduation rates and innovative financial aid packages.
Anyway, the overall top five are:
1. US Military Academy
2. Princeton
3. Caltech
4. Williams College
5. Harvard
The top five best value colleges are:
1. Berea College, Kentucky
2. New College of Florida
3. US Military Academy
4. US Air Force Academy
5. University of Wyoming
The top five national research universities are:
1. Princeton
2. Caltech
3. Harvard
4. Yale
5. Stanford
This is from The Morse Code
"It's getting very close to the launch of the new America's Best Colleges rankings. The 2010 edition will be published on Thursday, August 20, which is the day the new rankings go live on our website. The site will have the most complete version of the rankings, tables, and lists, plus extensive profiles on each school. The America's Best Colleges website also will have wide-ranging interactivity as well as a newly upgraded search feature to enable students and parents to find the school that best fits their needs.
These exclusive rankings will also be published in the magazine's September 2009 issue and in our newsstand guidebook, both of which will go on sale around August 20. The main rankings include the national universities, liberal arts colleges, master's universities, and baccalaureate colleges by region. In addition, there will be one new ranking to show which schools have the greatest "commitment to undergraduate teaching." For the second year in a row, we will publish the very popular list of "Up-and-Coming Institutions"—the colleges making innovative improvements. In addition, we will have our third annual ranking of Historically Black Colleges and Universities. "
Wednesday, August 12, 2009
In 2005 Duke University made an impressive showing in the THES-QS World University Rankings largely because someone at Quacquarelli Symonds counted undergraduate students as faculty. (see post January 29, 2007)
Perhaps it was not really an error. It looks like at least one Duke professor is intent on handing over her teaching duties to her students.
Cathy Davidson, a Duke professor, has told us about her "innovative" grading policies.
"I loved returning to teaching last year after several years in administration . . . except for the grading. I can't think of a more meaningless, superficial, cynical way to evaluate learning in a class on new modes of digital thinking (including rethining [sic or perhaps not -- maybe she means making even less substantial] evaluation) than by assigning a grade. It turns learning (which should be a deep pleasure, setting up for a lifetime of curiosity) into a crass competition: how do I snag the highest grade for the least amount of work? how do I give the prof what she wants so I can get the A that I need for med school? That's the opposite of learning and curiosity, the opposite of everything I believe as a teacher, and is, quite frankly, a waste of my time and the students' time. There has to be a better way . . .
So, this year, when I teach "This Is Your Brain on the Internet," I'm trying out a new point system. Do all the work, you get an A. Don't need an A? Don't have time to do all the work? No problem. You can aim for and earn a B. There will be a chart. You do the assignment satisfactorily, you get the points. Add up the points, there's your grade. Clearcut. No guesswork. No second-guessing 'what the prof wants.' No gaming the system. Clearcut. Student is responsible.
And how to judge quality, you ask? Crowdsourcing. Since I already have structured my seminar (it worked brilliantly last year) so that two students lead us in every class, they can now also read all the class blogs (as they used to) and pass judgment on whether they are satisfactory. Thumbs up, thumbs down. If not, any student who wishes can revise. If you revise, you get the credit. End of story. Or, if you are too busy and want to skip it, no problem. It just means you'll have fewer ticks on the chart and will probably get the lower grade. No whining. It's clearcut and everyone knows the system from day one. (btw, every study of peer review among students shows that students perform at a higher level, and with more care, when they know they are being evaluated by their peers than when they know only the teacher and the TA will be grading). "
So, every class is led by two students. An A is awarded for showing up for class, doing the work and having it judged as satisfactory by classmates or revising it after being judged unsatisfactory.
If classes are led by students, who also chose the reading and writing assignments and evaluate class contributions, and work is graded by students, then what is Professor Davidson being paid for?
Another point. Professor Davidson claims that all studies show that students perform at a higher level when they know they are being evaluated by peers rather than only by a teacher and a teaching assistant. We could of course argue about whether every study shows this and what a higher level means. But note that the studies are comparing students graded by peers and, presumably, instructors with those graded only by teacher and TA. From what Professor Davidson tells us grading in her class is done only by students and therefore the results of such studies cannot be used to support her claims.
Note: acknowledgement to Durham-in-Wonderland.
Monday, August 10, 2009
The new Performance Ranking of Scientific Papers for World Universities is out.
This is based on a variety of measures derived from the Essential Science Indicators database. It is therefore more orientated towards quality than the THE-QS rankings, which use the more comprehensive but less selective Scopus database.
These rankings may become more influential in the future, so it might be worthwhile making a few comments. First, like nearly all rankings, this one is biased towards the citation-heavy natural sciences. Second, eight indicators may be too many, since at least some of them are probably counting the same thing. Third, there is no attempt to measure anything other than research.
Still, the current rankings are important. Looking at the overall index, we find that Oxbridge and some of the Ivy League schools are definitely slipping, deprived of the support from the THE-QS academic survey and the aging or dead laureates of the Shanghai index. Cambridge is in 15th place, Yale 16th, Oxford 17th and Princeton 38th.
Here are the top five.
1. Harvard
2. Johns Hopkins
3. Stanford
4. University of Washington at Seattle
5. UCLA
I am wondering about the University of Washington, which is 16th in the Shanghai rankings and 59th in the THE-QS.
Sunday, August 09, 2009
It took a while for me to decide that this article by David G. Savage in the San Francisco Chronicle was not a parody. It is nonetheless worth reading carefully. Much of it will sound familiar to those who are aware of the ongoing debate about how university students and faculty should be selected.
The article begins:
"Justice Sonia Sotomayor will bring something new to the U.S. Supreme Court, far beyond her being its first Latina member."
And what will she bring? Savage approvingly lists the attributes that will justify her appointment to the Supreme Court.
- She will be the only judge whose first language is not English.
- She is diabetic.
- She grew up in a housing project where drugs and crime were more common than "Ivy League scholarly success".
- Her SAT scores were not very good but she managed to graduate first in her class at Princeton.
- "[She] is also a divorced woman with no children but a close relationship with an extended family.
"She is a modern woman with a nontraditional family," said Sylvia Lazos, a law professor at the University of Nevada at Las Vegas. "She is much more reflective of contemporary American society than the other justices, like Alito and Roberts."
She was referring to Chief Justice John Roberts and Justice Samuel Alito, both of whom are married and have two children. The court soon is expected to face a series of cases involving the legal rights of other nontraditional families with gay and lesbian couples."
- She has had trouble paying her mortgage and credit cards.
- She has been a prosecutor and a trial judge.
- She will be one of two minorities on the court, the other being Clarence Thomas, and the only one who supports Affirmative Action. Apparently Jews, Italians and WASPs are not minorities.
So Sotomayor is qualified for the highest judicial office in the United States because she is a speaker of English as a second language, a diabetic, not a good test taker but hard working, divorced, childless, a member of a recognised minority, a supporter of Affirmative Action and a poor financial manager.
The time will come, I suspect, when these will be essential qualifications for faculty positions in the US and elsewhere.
And will someone please explain to me why Sotomayor's childlessness is more reflective of contemporary American society than Roberts's and Alito's two children apiece. Or is Professor Lazos living in a parallel universe where the American fertility rate is zero?
Tuesday, August 04, 2009
From the Hawaii Star-Bulletin
David Ross, chairman of the University of Hawaii-Manoa Faculty Senate's Executive Committee, claims that the university's ranking performance means that they should not have to take a pay cut.
"Recently we heard the good news that the University of Hawaii Foundation had raised $330 million in charitable donations over a six-year period. What got less press attention was that the UH faculty had raised over $400 million in grant support, not over six years but in a single year. At the same time we learned that top UH executives, who earn mainly at or above the national average, were taking voluntary pay cuts by up to 10 percent, while lower-level executives would be cut 6 percent to 7 percent. Meanwhile, UH faculty, who (despite some recent raises) still earn well below our colleagues at peer institutions, are being asked to take a 15 percent cut. ...
By many independent measures, UH-Manoa remains one of the great universities in the world. We're one of only 63 public universities in the country with the highest Carnegie Foundation classification. The best-known international ranking of universities ranks us as tied for 59th in the Western Hemisphere.
These rankings are based on the quality of our faculty and programs, not our buildings or athletic records. At this level UHM is in intense competition for the best faculty, grants and students. It is not a coincidence that our successes in recent years, academic and financial, have followed the rebuilding of our faculty, both in size and in salary. We are worried that decisions being made right now by the state and the system will not only undo the recent progress we have made, but cause irreversible harm to our competitive standing. We are already losing faculty, and the cuts will make it difficult to recruit outstanding new faculty members and the research programs that they can develop. Since university rankings are based primarily on a faculty's reputation and grants, our hard-earned status as a research-extensive university could fall into jeopardy."
Sad, but there is an army of adjuncts and underemployed PhDs out there who would work at those reduced salaries or less, and who are just as well qualified or better.