Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, November 20, 2007
Note: corrections have been made to an earlier draft. Some of the figures for 2006 have been changed.
No, I am not being sarcastic. Information just released by QS, the consultants who prepare the data for the THES-QS rankings, shows that Malaysian universities have improved quite significantly in some respects over the last year.
QS have now published detailed information on the top 400 universities in the 2007 rankings. This confirms what I had suspected, namely that there has been no real decline in the quality of any Malaysian university and that the apparent fall in the positions of Universiti Malaya (UM), Universiti Kebangsaan Malaysia (UKM), Universiti Sains Malaysia (USM) and Universiti Putra Malaysia (UPM) is largely the result of nothing more than a change in methodology.
There is no point in comparing the scores in 2006 and 2007 for the various components because of the changes in methods this year. Basically, the introduction of Z scores means that any such comparison is meaningless. Last year, for example, UM was given a score of 1 for citations; this year it was 14. That does not mean anything by itself, since the mean score for citations among the top 400 universities was 9 in 2006 and 66 in 2007. To measure genuine change it is necessary to look at the relative positions of the universities.
I have calculated the positions of the Malaysian universities on the various criteria in 2006 and 2007. In both years I have looked only at the top 400, since information for universities below 400th place in 2007 is not currently available.
In 2006, Universiti Malaya was 90th for the "peer review", 238th for recruiter rating, 274th for student faculty ratio, 245th for international faculty, 308th for international students and 342nd for citations per faculty. UM managed to get into the top 200 because the score for "peer review" was given a much larger weighting than any other criterion.
This year, UM was 131st for the "peer review", 159th for recruiter rating, 261st for student faculty ratio, 146th for international faculty, 241st for international students and 377th for citations per faculty.
Thus, if QS are to be believed, UM has improved its standing with local employers, improved its student faculty ratio, and increased its numbers of international faculty and students, all relative to other universities. It did somewhat worse on citations per faculty.
The only serious blemish was that UM did rather worse on the "peer review", perhaps because as QS has suggested, respondents were no longer allowed to vote for their own institutions.
So, how could UM suffer such a catastrophic fall?
The answer lies in the use of Z scores. To summarise: a Z score is constructed by subtracting the population mean from the raw score, dividing by the standard deviation, and then converting the result to a standardised 0-100 scale.
The effect of this is to squash scores together at the top rather than at the bottom, as was previously the case. To go back to the citations scores: in 2006 UM got a score of 1, which was quite a bit below average. In 2007, because of the introduction of Z scores, the average published score was much higher. So while UM got 1 for citations in 2006 and Peking ("Beijing" then) got 2, this year UM got 14 and Peking got 53. Peking's lead over UM grew from one point to 39.
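The squashing can be illustrated with a toy calculation. The raw figures below are invented, and the exact transform QS applies to its Z scores has not been published; here the Z scores are pushed through the normal CDF, one common way of "normalising" them onto a 0-100 scale:

```python
# Toy illustration of how Z-score based scaling changes published scores.
# The raw "citation counts" below are invented; the CDF mapping is an
# assumption, not QS's documented method.
from statistics import NormalDist, mean, pstdev

raw = [2, 5, 8, 12, 20, 35, 60, 100, 180, 400]  # skewed, like citation data

# Old-style scaling: top scorer gets 100, everyone else proportional.
old = [round(100 * x / max(raw)) for x in raw]

# New-style scaling: Z-score each value, then map through the normal CDF.
m, s = mean(raw), pstdev(raw)
new = [round(100 * NormalDist().cdf((x - m) / s)) for x in raw]

print(old)  # most values sit far below 100
print(new)  # below-average values are pulled up; gaps near the top shrink
```

Under the old-style scaling the weakest institutions score near zero; after Z-scoring and normalising they are pulled up sharply, which is why a published mean can jump from 9 to 66 without any underlying change.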
The switch from ISI to Scopus may also have had some effect, but probably not very much.
Similarly, we find that UPM improved its position on two criteria and USM and UKM on three each. All suffered a decline on the "peer review".
UM's fall in the "peer review" section did not make a dramatic difference. Had UM remained in 90th place, its score for that section would have been only ten points higher, 76 instead of 66.
UM's supposed tumble happened solely because universities that are doing a bit more research are now getting a lot more points than before.
There has been no decline. Maybe Malaysian universities are not improving fast enough but that is quite a different thing from what the rankings appear to show and what is causing so much anxiety among Malaysian commentators.
Monday, November 19, 2007
- Imperial College Press is a joint venture of Imperial College and World Scientific.
- World Scientific is a Singapore-based publishing company whose subscription list is used by QS to construct their "peer review".
- Imperial gets a perfect score of 100 (rounded) for the "peer review" in the 2007 THES-QS rankings.
- Until last year, citation data were collected for QS by Evidence Ltd, a company headed by a former Imperial faculty member.
- QS gave Imperial a much better student faculty ratio than even the college itself claimed.
- Imperial is, according to the THES-QS rankings, the fifth best university in the world.
- Richard Sykes, Vice-chancellor of Imperial, is the second highest paid in the UK.
- Richard Sykes wants a massive increase in fees.
- Imperial is now the most popular UK destination for Singapore students.
- Richard Sykes is on QS's questionnaire telling respondents that it takes smart people to recognise smart people.
Is it conceivable that some of these might just possibly have something to do with one another?
One of the most remarkable things about the THES-QS rankings is the steady rise of Imperial College London. It has now reached 5th place, just behind Oxford, Cambridge and Yale and ahead of Princeton, MIT, Stanford and Tokyo.
How did this happen? Imperial's research performance is rather lacklustre compared with many American universities. The Shanghai Jiao Tong index puts it at 23rd overall, 33rd for highly cited researchers, 28th for publications in Science and Nature, and 29th for citations in the Science Citation Index.
Google Scholar also indicates that Imperial does much worse than many other places. A quick search comes up with 22,500 items for research published since 2002, compared with 22,700 for Seoul National University, 25,800 for McGill, 44,000 for Tokyo and 151,000 for Princeton.
Imperial does well on the THES-QS rankings partly because of outstanding scores on the peer review (99 out of 100), employer review (99) and international students (100).
It also comes first (along with 15 others with scores of 100) for student faculty ratio. Is this justified?
On its web site QS indicates that Imperial has 2,963 full time equivalent (FTE) faculty and 12,025 FTE students, a ratio of 4.06.
However, if we look at Imperial's site we find that the college claims 12,129 FTE students and 1,114 academic and 1,856 research staff.
It appears that QS has counted both academic and research staff when calculating Imperial's ratio. Looking at other universities, it appears that it is QS's standard practice to count research staff who do not teach as part of the faculty total. In contrast, Imperial itself calculates the ratio by dividing students by academic staff to produce a ratio of 11.2. If that ratio had applied Imperial would have been many places lower.
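The arithmetic behind the two definitions is easy to check against the figures quoted above (a minimal sketch; the staff and student numbers are as reported, the rounding is mine):

```python
# Imperial's student-faculty ratio under the two definitions discussed above.
students_qs = 12025    # FTE students, per QS's site
faculty_qs = 2963      # FTE "faculty", per QS

students_imp = 12129   # FTE students, per Imperial's own site
academic_staff = 1114  # academic (teaching) staff
research_staff = 1856  # research staff

# QS's figure is consistent with counting research staff as faculty:
print(round(students_qs / faculty_qs, 2))       # roughly 4.1 students per "faculty"
print(academic_staff + research_staff)          # 2970, close to QS's 2963

# Imperial's own calculation excludes research staff:
print(round(students_imp / academic_staff, 1))  # roughly 11 students per academic
```

The sum of academic and research staff (2,970) is within a few heads of QS's 2,963, while dividing students by academic staff alone gives a ratio close to the 11.2 Imperial itself quotes, which is what suggests QS is counting non-teaching researchers.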
If QS has been counting research staff in the total faculty score it would lead to the truly bizarre result that universities could hire a large number of researchers and get a substantial boost for the student faculty score.
So far it looks as though this is a general procedure and not a special privilege granted to Imperial alone, but it would introduce a definite bias in favour of universities that, like Imperial, employ large numbers of non-teaching researchers.
Saturday, November 17, 2007
The QS topuniversities site does not provide information about ranking components for universities outside the top 200. It does, however, provide links to pages with the raw data and their sources, for which they are to be highly commended.
It seems that this year Macquarie is recorded as having 1,018 full time equivalent (FTE) faculty (865 headcount) and 255 FTE international faculty (267 headcount). So 25% of Macquarie's faculty are international.
The information on total faculty was submitted by Baerbel Eckelmann on 8/10/07 and on international faculty by "Director" on 15/6/07.
The figure for 2006 was presumably much higher and incorrect. It would be interesting if Macquarie or QS could indicate how it was derived.
It would seem that one reason for the apparent decline of Macquarie was simply the correction of a previous error.
The QS site topuniversities has lists of the top 400 and the 401-500 universities in the 2007 THES-QS rankings.
Friday, November 16, 2007
There have been quite different reactions to the latest THES-QS rankings in the USA and Australia. It seems that nobody has noticed that Washington University in St. Louis fell from 48th place in 2006 to 161st this year. But there has been a great deal of discussion about why Macquarie University fell from 82nd to 168th.
It is difficult to figure out exactly what happened, since the introduction of a new scoring method makes it hard to compare 2006 and 2007 directly, but it is possible to compare a university's relative positions on the components of the rankings in the two years.
In 2006 Macquarie was 93rd in the "peer review", 46th for recruiter rating, 198th for student faculty ratio, 159th for citations per faculty, 1st for international faculty and 13th for international students.
In 2007 Macquarie was 142nd for the "peer review", 62nd for recruiter rating, 189th for student faculty ratio, 190th for citations per faculty, 55th for international faculty and 11th for international students.
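Laid out side by side, the component ranks quoted above make the two sharp falls easy to spot (a minimal sketch using only the figures from this post):

```python
# Macquarie's rank on each ranking component, 2006 vs 2007,
# as quoted in the post above. A positive change means a fall down the table.
ranks = {
    "peer review":       (93, 142),
    "recruiter rating":  (46, 62),
    "student faculty":   (198, 189),
    "citations/faculty": (159, 190),
    "intl faculty":      (1, 55),
    "intl students":     (13, 11),
}

for name, (r2006, r2007) in ranks.items():
    change = r2007 - r2006
    print(f"{name:18s} {r2006:>4d} -> {r2007:>4d}  ({change:+d})")
```

The drop of 54 places on international faculty and 49 places on the "peer review" dwarf every other movement, which is the basis for the conclusion that follows.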
It seems that the decline of Macquarie is due primarily to a poorer score for the "peer review", possibly because of a change in QS's methodology that meant that respondents could not select their own institutions, and to a dramatic fall from first place for numbers of international faculty.
It is impossible that the latter represents a real change over the year unless Macquarie has been expelling hundreds of international lecturers. Either QS used the wrong figure for 2006 and the correct one this year or they got it right in 2006 but made a mistake this year.
The administration at Macquarie ought to be able to answer these questions:
Did Macquarie provide QS with any information about international faculty in 2006 and in 2007? If so, what information was given and was it correct?
Sunday, November 11, 2007
Malaysian Universities
Since THES has suggested that Malaysian and Singaporean universities suffered in this year's rankings because respondents were not allowed to vote for their own institutions, and since this is obviously not true of the Singaporean universities, which got scores of 100 and 84 for the "peer review", I think it would be a good idea to wait a bit before making assumptions about the cause of the apparent decline of Malaysian universities. It is not totally impossible that QS has made another error or errors.
Friday, November 09, 2007
This year QS has introduced several "methodological enhancements" into the THES-QS rankings. One is the use of Z-scores. Basically, this means that the mean for all universities is deducted from the raw score and the result is then divided by the standard deviation. In effect, the score represents not an absolute number but how far each university is from the average. One consequence of using Z-scores is that differences at the very top are reduced.
In principle this is not a bad idea and other rankers do it but it has produced some odd results in this case.
In the survey of academic opinion, for example, the following universities all get a maximum score of 100: Harvard, Cambridge, Oxford, Yale, Caltech, MIT, Columbia, McGill, Australian National University, Stanford, Cornell, Berkeley, Melbourne, British Columbia, National University of Singapore, Peking and Toronto.
Do THES and QS really expect us to believe that Melbourne, British Columbia and Peking are just as good at research as Harvard? Especially since Harvard is far ahead on every one of the subject rankings?
THES has a headline about fine tuning revealing distinctions. Really?
The National University of Singapore is among the best in Asia and has always been ranked highly by THES-QS. This year, however, it has fallen from 19th to 33rd.
THES suggests that Malaysian and Singaporean universities have suffered because the "peer review" no longer allows respondents to pick their own institutions. This would not, however, seem to apply to NUS -- which got the maximum score of 100 (along with Oxford, Harvard and Caltech) on the survey -- and I wonder whether it applies to the Malaysian universities either. What happened was that NUS scored very poorly on the faculty student section.
It got 100 for "peer review", international faculty and international students, 93 for recruiter review, 84 for citations per faculty and 34 for faculty student ratio.
NUS has a self-reported ratio of about 17 students per faculty. Peking reports about 10 but QS gives it a score of 98, almost the same as Caltech at 100 with a well known ratio of about three.
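A toy calculation shows how Z-scoring could put a ratio of about 10 almost level with Caltech's 3 while leaving 17 far behind, provided the bulk of universities cluster around the middle of that range. The population below is invented, and the CDF-based scaling is an assumption, not QS's published method:

```python
# Invented population of student-faculty ratios (lower is better).
# If most universities cluster around 12-21, then ratios of 3 and 10 both
# sit far out in the "good" tail and get squeezed together near 100.
from statistics import NormalDist, mean, pstdev

population = [3, 10, 17] + [12 + i % 10 for i in range(400)]
m, s = mean(population), pstdev(population)

def score(ratio):
    # Lower ratio is better, so negate the deviation before the CDF mapping.
    z = (m - ratio) / s
    return round(100 * NormalDist().cdf(z))

for r in (3, 10, 17):
    print(r, score(r))
```

Under these assumptions a ratio of 3 and a ratio of 10 end up within a point or two of each other while 17 scores below 50, which is roughly the pattern of the Caltech, Peking and NUS scores quoted above.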
There is something about this that needs some explanation.
The Kuala Lumpur New Straits Times has a report on the performance of Malaysian universities in the latest THES-QS rankings:
Malaysian universities are on a slippery slope. None of them made it to the top 200 placing in the Times Higher Education Supplement (THES)-Quacquarelli Symonds (QS) World University Rankings this year.
This poor showing comes on the back of a recent government survey of local public universities which found that none deserved a place in the outstanding category. Last year, Universiti Kebangsaan Malaysia and Universiti Malaya made it to the top 200 in the THES-QS rankings. UKM ranked 185th, up from the 289th spot in 2005, beating well-known universities like University of Minnesota in the United States and University of Reading, Britain. This year, it has fallen to 309th. Similarly UM, which was ranked among the world's top 100 universities three years ago, was in 169th position in 2005 and tied with University of Reading in the 192nd spot last year. It has dropped to the 246th spot. Universiti Sains Malaysia has fallen to 307 from 277 last year.
UKM and UM vice-chancellors attributed their fall to the new methodology used to calculate rankings this year. "Even the National University of Singapore (NUS) has dropped to the 33rd spot when it was always within the top 10," Universiti Malaya vice-chancellor Datuk Rafiah Salim said. "The way I look at it, smaller countries like Malaysia are bound to lose out as THES has introduced new criteria which is peer review and has changed the citation and list of publications." Rafiah said with more than 3,000 universities getting ranked by THES annually, Malaysian universities had to improve if they wanted to remain on top of the list. "If we want to compete with some of the top universities in the world, first we have to be in the same league. Right now, we are not. One way to overcome that is through adequate funding." She said NUS received an annual funding of S$1.2 billion (RM2.7 billion) a year compared to UM's RM400 million annual budget.
There is no mention of Universiti Putra Malaysia or Universiti Teknologi Malaysia both of which were on the list of universities sent out by QS this year.
It is impossible to be sure until the full data is released but I suspect that the "decline" of Malaysian universities has nothing to do with any real change but with QS preventing survey respondents from voting for their own institutions this year.
Thursday, November 08, 2007
The THES-QS Top 200 universities list is available here.
There is a press release here.
Some Highlights
The two Malaysian universities, UM and USM, are out of the top 200. Most probably this is because of new procedures for the "peer review".
Berkeley, National University of Singapore, Peking (well done QS for getting the name right), and LSE have fallen dramatically.
The IITs and IIMs are out of the top 200, maybe out of the rankings altogether.
Two Brazilian universities have risen dramatically.
Changes such as these could not possibly result from real changes but are most likely the consequence of "methodological enhancements", errors or the correction of errors.
The Sydney Morning Herald reports on the THES-QS rankings. Macquarie has fallen from 82 to 168.
AUSTRALIAN universities have slipped in one of the most respected world rankings. The most dramatic drop was suffered by Macquarie University - jeopardising a $100,000 bonus for its vice-chancellor, Steven Schwartz.
His bonus depends on improving Macquarie University's ranking in the Australian sector, but it has plummeted from 82 to 168 in the Times Higher Education Supplement's annual survey, released in Britain overnight. It has dropped from seventh to ninth among local universities.
Professor Schwartz, an American academic who had previously been head of Brunel University in Britain, replaced Di Yerbury in a messy coup last year. There was a bitter dispute between Macquarie and Professor Yerbury over ownership of paintings and other material she had accumulated over 19 years.
We will have to wait until the online results are available but the fall of Macquarie and perhaps of Steven Schwartz may have something to do with a reported change in the percentage of international faculty or possibly the introduction of z scores in the rankings. Last year Macquarie held top place for international faculty but QS did not reveal how they got the information and Macquarie did not confirm what the correct number was. Given the money at stake, it would not be totally astonishing if the 2006 figure for international faculty had been massaged a little bit.
The Economic Times of India has a report on the THES-QS rankings:
Three Latin American universities make it to the world’s top 200, while even Africa makes a debut, with Cape Town ranked at 200. IIMs and IITs are not universities.
According to Martin Ince, who compiles and edits the survey, “The 2007 THES-QS World University Rankings are the most rigorous and complete so far. They show that the US and the UK model of independent universities supported with significant state funding produces great results.”
UK universities are closing in on their American counterparts, with University College, London, making it into the top 10 for the first time, and Imperial College, London, moving up from 9th to 5th this year. Chicago too, is a first time entrant into the top 10.
While the top 10 list is still restricted to US and UK universities, with the addition of the Netherlands, 12 countries are featured in the top 50 compared to 11 in 2006.
Universities of Tokyo, Hong Kong, Kyoto, National University of Singapore, Peking, Chinese University of Hong Kong, Tsinghua and Osaka lead Asian higher education, all featuring in the top 50. The top 100 sees the number of Asian universities increase to 13 (12 in 2006), while the number of European Universities has dropped to 35 (41 in 2006).
North America strengthened its tally to 43 Universities (37 in 2006). McGill tops in Canada, and a number of universities from New Zealand and Australia have also joined the top 50 list.
The increasing trend in internationalisation is also borne out by the fact that 143 of the top 200 universities reported an increase in their percentage of international faculty to total faculty, while 137 of the top 200 universities reported an increase in their percentage of international students to total students.
The last comment is rather interesting. Is this genuine internationalisation or simply a manipulation of data provided by universities?
Education Guardian
BBC
Chronicle of Higher Education
Beerkens Blog has the top 100. Here are the top 20.
Rank | Name | Country |
1 | HARVARD University | United States |
2= | University of CAMBRIDGE | United Kingdom |
2= | YALE University | United States |
2= | University of OXFORD | United Kingdom |
5 | Imperial College LONDON | United Kingdom |
6 | PRINCETON University | United States |
7= | CALIFORNIA Institute of Technology (Caltech) | United States |
7= | University of CHICAGO | United States |
9 | UCL (University College LONDON) | United Kingdom |
10 | MASSACHUSETTS Institute of Technology (MIT) | United States |
11 | COLUMBIA University | United States |
12 | MCGILL University | Canada |
13 | DUKE University | United States |
14 | University of PENNSYLVANIA | United States |
15 | JOHNS HOPKINS University | United States |
16 | AUSTRALIAN National University | Australia |
17 | University of TOKYO | Japan |
18 | University of HONG KONG | Hong Kong |
19 | STANFORD University | United States |
20= | CORNELL University | United States |
20= | CARNEGIE MELLON University | United States |
The Canadian newspaper The Gazette has a report on the performance of Canadian universities in the 2007 rankings. McGill has risen from 21st place to 12th. The Gazette reports that:
McGill University is the cream of Canadian schools, the best public university in North America and ranks 12th among the world's top 200 universities, according to a prestigious global survey.
Released today, the Times Higher Education Supplement has McGill bounding up from last year's 21st place showing based on such factors as emphasis on science programs, the strong contingent of international students and faculty, student/faculty ratios, and publications by faculty and graduate researchers. Harvard placed first on the list, while Oxford, Cambridge and Yale tied for second spot.
The report puts McGill ahead of such research-intensive powerhouses as Duke, Johns Hopkins, Stanford and Cornell. Findings are based on a combination of facts and opinions, with more than 5,000 academics around the world invited to rate a given institution. A key change in methodology this year made it impossible for professors to rate their own school.
"I'm really thrilled," said McGill principal Heather Munroe-Blum, who sees the results as a vindication of McGill's disciplined approach to academic planning, targeted hiring of 800 new professors and efforts to enhance both research and the undergraduate experience.
There are even more spectacular rises by Montreal (181 to 108), Queens (176 to 88), Waterloo (204 to 112), Western Ontario (215 to 126) and Simon Fraser (266 to 166).
Changes like this are most unlikely to be produced by real improvements by the universities concerned. Either the methodological changes introduced by QS are having a greater impact than expected or some serious errors have occurred.
Also, if the statement about 5,000 academics originated from QS, does this mean that this year QS sent out only 5,000 e-mails instead of the nearly 200,000 they claimed to have sent last year? Or that they received 5,000 forms? Or that they counted 5,000?
Three British universities in the THES-QS Top Five
Bits and pieces about the THES-QS 2007 rankings are appearing in online newspapers. Here is a quotation from the London Times
Cambridge and Oxford are the second best universities in the world according to the latest rankings, and British universities are closing the gap with those in the United States.
Oxford and Cambridge share the number two spot with Yale, with Harvard ranked number one in the latest league tables from The Times Higher Education Supplement.
The findings will bring cheer to the higher education sector in Britain at a time of growing concern among vice-chancellors and employers that British universities will lose students to better-financed institutions abroad and that business will then follow them with jobs and investment.
The commercial implications of the rankings are made very clear:
Professor Rick Trainor, the president of Universities UK, representing vice-chancellors, added: “Our competitors are increasingly marketing themselves more aggressively so it is vital that the UK remains among the foremost destinations for international students and staff.”
Harvard, whose endowment of $35 billion (£16.6 billion) is roughly equal to the combined annual funding for all English universities, tops the table, but its lead over its closest rivals has fallen from 3.2 to 2.4 points. Nunzio Quacquarelli, the managing director of QS, the careers and education group that compiled the rankings, said: “In an environment of increasing student mobility, the UK is putting itself forward as a top choice for students worldwide. They are taking a closer look at the quality of faculty, international diversity and, of course, at the education they will receive.”
A detailed analysis will have to wait until the component scores are available but the continued closing of the gap between Oxbridge and Harvard and the rise of University College London from 25th to 9th and Imperial College London from 8th to 5th are rather suspicious.
The top ten are
1 Harvard University US
2 University of Cambridge UK
2 University of Oxford UK
2 Yale University US
5 Imperial College, London UK
6 Princeton University US
7 California Institute of Technology (Caltech) US
7 University of Chicago US
9 University College London (UCL) UK
10 Massachusetts Institute of Technology (MIT) US
Tuesday, November 06, 2007
Changes in the THES-QS Rankings
QS Quacquarelli Symonds have announced that the 2007 World Universities Rankings will be published on November 9th and that there will be a number of changes.
Firstly, QS will not allow respondents to their academic survey to vote for their own institutions. I am not sure how this could be enforced if QS send out over a quarter of a million e-mails to World Scientific subscribers, but it would in principle appear to be a sensible change. However, this in itself will not affect other, more serious problems with the “peer review”, such as its marked regional bias and a suspiciously and unbelievably low response rate.
The second change is that QS will now use Scopus rather than the Web of Science for data about citations. This will favour universities outside the English-speaking world, though I suspect that the difference will not be very great.
Thirdly, QS will give Full Time Equivalent (FTE) counts for numbers of students and faculty rather than headcounts. This would eliminate some of the worst errors in previous rankings such as those relating to Ecole Polytechnique and Ecole Normale Superieure. However, there could be problems if the procedure is not applied consistently. QS say that where an FTE number has not been supplied, one will be calculated from the relationship between headcount and FTE numbers at other institutions in the same country or region.
This raises questions about the country or region that is used for benchmarking and whether QS will indicate how the ratio between headcount and FTE is derived. Also, it seems rather dangerous to allow universities to submit their own data.
Finally, QS will calculate z scores for all components. Basically a z score is calculated by subtracting the population mean from the raw score and then dividing by the standard deviation. The effect of this will be to flatten the curves for each component and to ensure that similar changes will have similar effects on each section of the ranking.
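A minimal sketch of the calculation just described (the component scores below are invented):

```python
# Minimal z-score calculation for one ranking component.
from statistics import mean, pstdev

scores = [55.0, 60.0, 70.0, 85.0, 100.0]  # invented raw component scores
m, s = mean(scores), pstdev(scores)
z = [(x - m) / s for x in scores]

print([round(v, 2) for v in z])
```

After this transformation every component has mean 0 and standard deviation 1, so a one-standard-deviation improvement moves a university by the same amount on any section, whatever its starting score.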
QS are to be commended for introducing these changes, provided that they are implemented transparently and competently. There is no point in calculating z scores if you enter the data for every university in the wrong row, as someone did for the student faculty ratio in QS’s book Guide to the World’s Top Universities, creating hundreds of errors.
I have a further reservation. QS seems to have done nothing about using a database for the “peer review” that is provided by an Asian-based and Asian-orientated publishing company, about explaining how they could get an unprecedentedly low response rate without filtering the data in some way, or about giving a large weighting to such an obviously biased and suspect set of data. It will be interesting to recalculate the scores to see what they look like without the peer review.
The combined effect of these changes is likely to be that some universities outside the top 100 will go down several places, even though nothing has really changed, leading to anguished debates about declining standards.
Wednesday, October 17, 2007
This is from Plagiarism in Colleges in USA, a page by Donald B. Sandler. The similarities between the two cases of "citation infraction" by doctoral candidates, M. Jamil Hanifi and Glenn (or Glendal) Poshard, are striking but the difference in the fate of the two dissertation submitters is glaring. I wonder if the faculty panel who produced the report on Glenn Poshard's dissertation could read this without blushing.
Hanifi
M. Jamil Hanifi plagiarized material from a book and an essay in his doctoral dissertation at Southern Illinois University in 1969. Hanifi later published "his" dissertation in a book, of which "three of the nine substantive chapters ... were plagiarized." The author of the essay discovered the plagiarism in 1976, the author of the book discovered the plagiarism in 1977. Southern Illinois University learned of the plagiarism in 1981. At that time, Hanifi was a professor of anthropology at Northern Illinois University, who was being considered as a new chairman of the department. Tersely summarizing a long recital in the court's opinion, Hanifi was given the choice of resigning or being fired, Hanifi chose to resign. Hanifi then filed litigation that alleged that his resignation had been coerced. Hanifi v. Board of Regents, 1994 WL 871887 (Ill.Ct.Cl. 1994).
The court said the following regarding plagiarism:
- John LaTourette, the current president of Northern Illinois University, who was the vice-president and provost of that university in 1981, acknowledged that plagiarism is "probably the most serious charge against a faculty member that one could imagine." The president of the university in 1981, William Monat, similarly acknowledged that plagiarism is "probably one of the greatest offenses that can occur in the academic community." Mr. Hanifi, himself, has written to others and admitted during his testimony that plagiarism involves "a complete lapse in professional judgment, moral sense and respect for academic ethics," "a most serious violation with dishonor, shame and guilt," "unethical conduct," "dishonorable and unprofessional conduct," and "dishonorable act and reprehensible and condemnable," "a violation of basic scholarly activity and serious misconduct," "a despicable act and a serious mistake." Mr. Hanifi acknowledged that the plagiarism is not erasable.
- Id. at *2.
- The court concluded that Hanifi had failed to prove that his resignation had been coerced. Note the court's final sentence about the bad character of a plagiarist:
- From a thorough review of the evidence in this case, we find that the Claimant has failed to prove that his resignation was involuntary, coerced or the product of duress. The testimony of Claimant and Respondent's witnesses is at loggerheads. To believe Claimant's testimony as to coercion, duress and involuntariness, we would have to disbelieve numerous other witnesses and find some grand conspiracy among the top officials at Northern Illinois University to injure Claimant, which would include mass perjury. Claimant has presented no compelling evidence to corroborate his testimony and therefore in light of the credible testimony disputing his claim, we find his testimony incredible. Frankly, we do not believe this admitted plagiarizer when he claims his will was overcome and he did not know what he was doing.
- Id. at *6.
"a complete lapse in professional judgment, moral sense and respect for academic ethics," "a most serious violation with dishonor, shame and guilt," "unethical conduct," "dishonorable and unprofessional conduct," and "dishonorable act and reprehensible and condemnable," "a violation of basic scholarly activity and serious misconduct," "a despicable act and a serious mistake."
Compare these words with President Poshard's.
Sunday, October 14, 2007
Inside Higher Ed reports that Southern Illinois University (SIU) president Glenn Poshard has been cleared of deliberate plagiarism. It seems that a panel appointed by the university found that the president did not intentionally plagiarise his doctoral dissertation awarded by SIU in 1984.
The seven-person committee of senior faculty, whose report on Poshard was unveiled Thursday, recommended that the university take no action against the president. It calls for the dissertation to be withdrawn from the university library and be replaced with a corrected copy prepared by Poshard, and for the president to write a statement that expands on why his errors occurred and speaks more broadly about the “culture of integrity” at the university. (The panel noted that allegations of Poshard plagiarizing his master’s thesis follow the same pattern as those in the dissertation, so it chose to focus on the latter.)

Still, the report is far from a ringing endorsement of Poshard’s past work. The committee notes that there are many cases in the dissertation in which “the words of others are present in a continuous flow with Student Poshard’s own words, so that readers cannot distinguish between those sources.” Given the modern-day definition of plagiarism at Southern Illinois’s Graduate School as “representing the work of another as one’s own work,” the report says the allegations against Poshard would be “sufficiently supported,” were it not for the historical context of the case.
But that context is vital, the report notes. At the time when Poshard was a graduate student at Southern Illinois, the graduate school’s student handbook lacked a definition of plagiarism. The panel found that Poshard had used an “informal style” of citing sources that was commonly embraced by other graduate students. Faculty members advising him on the dissertation approved the style then, and no one asked him to clarify anything at the time of submission, the report finds.
The mistakes were most likely products of “carelessness” and fall into the category of “inadvertent plagiarism,” according to the committee.
Poshard has denied the allegations of intentional wrongdoing but left open the possibility that he made accidental errors. During a meeting with the faculty panel, the president said his dissertation committee had no qualms with his style of citation, which often included scant inclusion of quotation marks.
“Even though the Review Committee says these mistakes were unintentional and inadvertent, they are my mistakes. And I take full responsibility for them,” he said at a news conference Thursday. “They are not the fault of my committee, my department, my college or my university.” He added that “whether one wants to argue whether what I did constitutes plagiarism depends on how you feel about me.”
The main points that emerge from the panel's findings would seem to be that:
- a large part of Poshard's literature review consisted of bits of other people's writings, some of which were cited but not quoted and some of which were neither quoted nor cited
- but, while corrections should be made, Poshard is excused because it was common for graduate students at SIU to use an "informal style" of citation (does that mean anything other than copying?) with the approval of their instructors and supervisors, because the graduate student handbook did not have a definition of plagiarism and because his dissertation committee did not find anything wrong with his citation style.
I hope that what follows is an accurate summary of the committee's findings. Poshard committed technical errors but was not seriously at fault because graduate students at SIU routinely constructed dissertations by copying pieces from other works.
I hope that somebody will examine a sample of dissertations from the Southern Illinois Education Faculty and find out whether the faculty has in fact been condoning mass plagiarism over the years.
It also might be an idea to take a close look at the dissertations and see whether SIU also had an "informal style of data collection".
This report is a remarkable document. If taken seriously, it ought to make graduates of SIU over the last two and a half decades unemployable and call its accreditation into question.
Sunday, September 30, 2007
The Leiden Rankings
The Centre for Science and Technology Studies at the University of Leiden has developed a ranking system based on bibliometrics, that is, the quantitative analysis of academic publications and citations. The Centre has presented results for the 100 largest European universities (including Israel).
The rankings use a multiple-indicator approach, presenting the same data in several different ways. The Centre explains:
The latter point is very important: on the basis of the same data and the same technical and methodological starting points, different types of impact-indicators can be constructed, for instance one focusing entirely on impact, and another in which also scale (size of the institution) is taken in to account. Rankings based on these different indicators are not the same, although they originate from exactly the same data. Moreover, rankings are strongly influenced by the size-threshold used to define the set of universities for which the ranking is calculated. This is very clearly illustrated by comparison of the top-100 versus the top-50 European universities.
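The Centre's point is easy to demonstrate. The following Python sketch uses entirely hypothetical figures (not Leiden data) to show how the same publication and citation counts produce different orderings depending on whether the indicator is size-dependent (total output) or size-independent (citations per publication):

```python
# Hypothetical figures for three invented universities, used only to
# illustrate the multiple-indicator point; these are not Leiden data.
universities = {
    "Alpha": {"papers": 12000, "citations": 90000},  # large, moderate impact
    "Beta":  {"papers": 4000,  "citations": 48000},  # small, high impact
    "Gamma": {"papers": 9000,  "citations": 81000},
}

# Size-dependent indicator: rank by total number of publications.
by_output = sorted(universities,
                   key=lambda u: universities[u]["papers"],
                   reverse=True)

# Size-independent indicator: rank by citations per publication.
by_impact = sorted(universities,
                   key=lambda u: universities[u]["citations"] / universities[u]["papers"],
                   reverse=True)

print("By output:", by_output)  # → ['Alpha', 'Gamma', 'Beta']
print("By impact:", by_impact)  # → ['Beta', 'Gamma', 'Alpha']
```

The large producer tops one list and finishes last on the other, which is exactly why rankings built from identical data need not agree.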
The Centre provides four different rankings. The top ten in the yellow ranking, by number of publications, is as follows.
1. Cambridge
2. University College London
3. Oxford
4. Imperial College London
5. Munich
6. Pierre and Marie Curie (Paris 6)
7. Milan
8. Utrecht
9. Catholic University Leuven (Flemish)
10. Manchester
The green ranking is by "size-independent, field-normalized average impact".
1. Oxford
2. Cambridge
3. ETH Zurich
4. Lausanne
5. Geneva
6. Edinburgh
7. University College London
8. Erasmus Rotterdam
9. Imperial College London
10. Basel
The blue ranking is by citations per publication.
1. Lausanne
2. Geneva
3. Oxford
4. Basel
5. Karolinska Institute Sweden
6. Erasmus Rotterdam
7. University College London
8. Cambridge
9. Edinburgh
10. Zurich
The orange ranking is by the size-dependent counterpart of this measure: the number of publications multiplied by field-normalized average impact.
1. Cambridge
2. Oxford
3. University College London
4. Imperial College London
5. Utrecht
6. Helsinki
7. ETH Zurich
8. Catholic University of Leuven (Flemish)
9. Munich
10. Karolinska
See the Centre's web site for an explanation of the technical terminology.
What is interesting about these rankings is that continental European universities are closing in on Oxford, Cambridge and the London colleges and, when it comes to the citation of academic publications, are beginning to move ahead.
As a measure of current research performance these rankings are superior to the Shanghai Jiao Tong rankings, which count long-dead Nobel prize winners.
Thursday, September 27, 2007
The Journal of Blacks in Higher Education has published a ranking of 26 competitive American universities according to the numbers and success of African American students and faculty.
Unlike other ranking efforts in the field of higher education, our statistics, without exception, are highly quantitative. This is in sharp contrast to highly impressionistic institutional rankings such as those compiled by U.S. News & World Report in which 25 percent or more of the total ranking score is derived from subjective surveys of university reputations as determined by presidents, provosts, and deans of admissions at other institutions.
All JBHE data is obtained from our own in-house surveys of the colleges and universities as well as from government sources. Each year JBHE surveys university and college admissions offices to obtain data on applicants, acceptances, first-year enrollments, and black student yield. On a regular basis we also survey deans of faculty at these universities for statistical information on their numbers and percentages of black faculty and black tenured faculty.
While one may disagree over what measuring factors are most important, the data we collect is broad-based, solid, quantifiable, and not subject to dispute.
5. Vanderbilt
6. North Carolina at Chapel Hill
7. Georgetown
8. Harvard
9. Virginia
10. Brown
Tuesday, September 25, 2007
Imperial College London has announced that the Principal of its Faculty of Medicine has been appointed to a joint position as Principal of the Faculty of Medicine and Chief Executive of the Imperial College Healthcare NHS Trust.
The report notes that Imperial College "is ranked fourth in the world for biomedical research in the Times Higher Education Supplement world university rankings".
It does not note that, according to the THES's own data on citations per paper for medical research, the college is ranked 28th. Nor that according to the Shanghai Jiao Tong rankings it is ranked 25th for clinical medicine and pharmacy.
Nor that the survey that produced the fourth-place ranking used a database provided by a publishing company with close ties to Imperial, and that the data on research was collected by a company headed by a former faculty member.