Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, August 24, 2017
Comment by Christian Scholz
This comment is by Christian Schulz of the University of Hamburg. He points out that the University of Hamburg's rise in the Shanghai rankings was not the result of highly cited researchers moving from other institutions but of an improvement in research within the university.
If this is something that applies to other German universities, then it could be that Germany has a policy of growing its own researchers rather than importing talent from around the world. It seems to have worked very well for football, so perhaps the obsession of British universities with importing international researchers is not such a good idea.
I just wanted to share with you that we did not acquire two researchers to get on the HCR list in order to get a higher rank in the Shanghai Ranking. Those two researchers are Prof. Büchel and Prof. Ravens-Sieberer. Prof. Büchel has been working at our university for over a decade now and Prof. Ravens-Sieberer has been at our university since 2008.
Please also acknowledge that our place in the Shanghai Ranking was very stable from 2010-2015. We were very unhappy when they decided to only use the one-year list of HCR, because in 2015 none of our researchers made it onto the 2015 list, which caused the decline from 2015 to 2016.
Guest Post by Pablo Achard
This post is by Pablo Achard of the University of Geneva. It refers to the Shanghai subject rankings. However, the problem of outliers in subject and regional rankings is one that affects all the well-known rankings and will probably become more important over the next few years.
How a single article is worth 60 places
We can’t repeat it enough: an indicator is bad when a small variation in the input is overly amplified in the output. This is the case when indicators are based on very few events.
I recently came across this issue (again) with Shanghai’s subject ranking of universities. The universities of Geneva and Lausanne (Switzerland) share the same School of Pharmacy, and a huge share of published articles in this discipline are signed under the name of both institutions. But in the “Pharmacy and pharmaceutical sciences” ranking, one is ranked between the 101st and 150th position while the other is 40th. Where does this difference come from?
Comparing the scores obtained under each category gives a clue:
|  | Geneva | Lausanne | Weight in the final score |
|---|---|---|---|
| PUB | 46 | 44.3 | 1 |
| CNCI | 63.2 | 65.6 | 1 |
| IC | 83.6 | 79.5 | 0.2 |
| TOP | 0 | 40.8 | 1 |
| AWARD | 0 | 0 | 1 |
| Weighted sum | 125.9 | 166.6 |  |
So the main difference between the two institutions is the score in “TOP”. Actually, the difference in the weighted sum (40.7) is almost equal to the value of this score (40.8). If Geneva and Lausanne had the same TOP score, they would be 40th and 41st.
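To make the arithmetic explicit, here is a minimal sketch that reproduces the weighted sums in the table above from the published indicator scores; the only assumption is that the overall score is the simple weighted sum implied by the table.

```python
# Reproduce the weighted sums from the indicator scores in the table above.
# Weights follow the table: PUB, CNCI, TOP and AWARD count fully, IC counts 0.2.
weights = {"PUB": 1, "CNCI": 1, "IC": 0.2, "TOP": 1, "AWARD": 1}

scores = {
    "Geneva":   {"PUB": 46.0, "CNCI": 63.2, "IC": 83.6, "TOP": 0.0,  "AWARD": 0.0},
    "Lausanne": {"PUB": 44.3, "CNCI": 65.6, "IC": 79.5, "TOP": 40.8, "AWARD": 0.0},
}

for university, s in scores.items():
    weighted_sum = sum(weights[k] * s[k] for k in weights)
    print(f"{university}: {weighted_sum:.1f}")
# Geneva: 125.9, Lausanne: 166.6 -- a gap of 40.7, almost exactly the TOP score of 40.8
```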
Surprisingly, a look at other institutions for that TOP indicator shows only 5 different values: 0, 40.8, 57.7, 70.7 and 100. According to the methodology page of the ranking, “TOP is the number of papers published in Top Journals in an Academic Subject for an institution during the period of 2011-2015. Top Journals are identified through ShanghaiRanking’s Academic Excellence Survey […] The list of the top journals can be found here […] Only papers of ‘Article’ type are considered.”
Looking deeper, there is just one journal in this list for Pharmacy: NATURE REVIEWS DRUG DISCOVERY. As its name indicates, this recognized journal mainly publishes ‘reviews’. A search on Web of Knowledge shows that in the period 2011-2015, only 63 ‘articles’ were published in this journal. That means a small variation in the input is overly amplified.
I searched for several institutions and rapidly found this rule: Harvard published 4 articles during these five years and got a score of 100; MIT published 3 articles and got a score of 70.7; 10 institutions published 2 articles and got a 57.7; and finally about 50 institutions published 1 article and got a 40.8.
I still don’t get why this score is so non-linear. But Lausanne published one single article in NATURE REVIEWS DRUG DISCOVERY and Geneva none (they published ‘reviews’ and ‘letters’ but no ‘articles’), and that small difference led to a gap of at least 60 places between the two institutions.
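Whatever transformation ShanghaiRanking actually applies, it is clearly not a simple linear rescaling of the article count. A quick check using only the values reported above (my own illustration, not the ranking's published formula):

```python
# Observed TOP scores versus the article counts reported above.
observed = {4: 100.0, 3: 70.7, 2: 57.7, 1: 40.8}

# What a simple linear rescaling (score = 100 * n / n_max) would have given instead.
n_max = max(observed)
for n, score in sorted(observed.items(), reverse=True):
    linear = 100 * n / n_max
    print(f"{n} article(s): observed {score:5.1f}, linear rescaling would give {linear:5.1f}")

# One article is worth 40.8 points here rather than 25, so a single paper in a
# 63-article journal moves an institution much further than linear scaling would.
```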
This is of course just one example of what happens too often: rankers want to publish sub-rankings and end up with indicators where outliers can’t be absorbed into large distributions. One article, one prize or one co-author in a large and productive collaboration all of a sudden makes very large differences in final scores and ranks.
Friday, August 18, 2017
Comment on the 2017 Shanghai Rankings
In the previous post I referred to the vulnerabilities that have developed in the most popular world rankings, THE, QS and Shanghai ARWU: indicators that carry a large weighting and can be influenced by universities that know how to work the system or that are sometimes just plain lucky.
In the latest QS rankings four universities from Mexico, Chile, Brazil and Argentina have 90+ scores for the academic reputation indicator, which has a 40% weighting. All of these universities have low scores for citations per faculty which would seem at odds with a stellar research reputation. In three cases QS does not even list the score in its main table.
I have spent so much time on the normalised citation indicator in the THE world and regional rankings that I can hardly bear to revisit the issue. I will just mention the long list of universities that have achieved improbable glory by a few researchers, or sometimes just one, on a multi-author international physics, medical or genetics project.
The Shanghai rankings were once known for their stability but have become more volatile recently. The villain here is the highly cited researchers indicator which has a 20% weighting and consists of those scientists included in the lists now published by Clarivate Analytics.
It seems that several universities have now become aware that if they can recruit a couple of extra highly cited researchers to the faculty they can get a significant boost in these rankings. Equally, if they should be so careless as to lose one or two then the ranking consequences could be most unfortunate.
In 2016 a single highly cited researcher was worth 10.3 points in the Shanghai rankings, or 2.06 on the overall score after weighting, which is the difference between 500th place and 386th. That is a good deal, certainly much better than hiring a team of consultants or sending staff for excruciating transformational sharing sessions.
Of course, as the number of HiCis increases the value of each additional researcher diminishes, so it would make little difference if a top 20 or 30 university added or lost a couple of researchers.
Take a look at some changes in the Shanghai rankings between 2016 and 2017. The University of Kyoto fell three places from 32nd to 35th place or 0.5 points from 37.2 to 36.7. This was due to a fall in the number of highly cited researchers from seven to five which meant a fall of 2.7 in the HiCi score or a weighted 0.54 points in the overall score.
McMaster University rose from 83rd to 66th gaining 2.5 overall points. The HiCi score went from 32.4 to 42.3, equivalent to 1.98 weighted overall points, representing an increase in the number of such researchers from 10 to 15.
Further down the charts, the University of Hamburg rose from 256th with an overall score of 15.46 to 188th with a score of 18.69, brought about largely by an improvement in the HiCi score from zero to 15.4, which was the result of the acquisition of two researchers.
Meanwhile the Ecole Polytechnique of Paris fell from 303rd place to 434th partly because of the loss of its only highly cited researcher.
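To make the arithmetic behind these examples explicit, here is a minimal sketch. It assumes only what is stated above: the HiCi indicator carries a 20% weight, and the indicator scores are those quoted in this post.

```python
# Convert a change in the HiCi indicator score into a change in the overall
# ARWU score, using the indicator's 20% weight.
HICI_WEIGHT = 0.20

def overall_effect(hici_change):
    """Weighted contribution of a change in the HiCi score to the overall score."""
    return HICI_WEIGHT * hici_change

# Kyoto: HiCi score fell by 2.7 when its highly cited researchers went from seven to five.
print(f"{overall_effect(-2.7):+.2f}")         # -0.54, roughly the 0.5-point drop from 37.2 to 36.7
# McMaster: HiCi score rose from 32.4 to 42.3 (researchers from 10 to 15).
print(f"{overall_effect(42.3 - 32.4):+.2f}")  # +1.98 of the 2.5-point overall gain
# One highly cited researcher worth 10.3 HiCi points in 2016:
print(f"{overall_effect(10.3):+.2f}")         # +2.06, the gap between 500th and 386th place
```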
It is time for ShanghaiRanking to start looking around for a Plan B for their citations indicator.
Wednesday, August 16, 2017
Problems with global rankings
There is a problem with any sort of standardised testing. A test that is useful when a score has no financial or social significance becomes less valid when coaching industries work out how to squeeze a few points out of docile candidates and motivation becomes as important as aptitude.
Similarly, a metric used to rank universities may be valid and reliable when nobody cares about the rankings. But once they are used to determine bureaucrats' bonuses, regulate immigration, guide student applications and distribute research funding then they become less accurate. Universities will learn how to apply resources in exactly the right place, submit data in exactly the right way and engage productively with the rankers. The Trinity College Dublin data scandal, for example, has indicated how much a given reported income can affect ranks in the THE world rankings.
All of the current "big three" of global rankings have indicators that have become the source of volatility and that are given a disproportionate weighting. These are the normalised citations indicator in the THE rankings, the QS academic survey and the highly cited researchers list in the Shanghai ARWU.
Examples in the next post.
Monday, August 14, 2017
Some implications of the Universitas 21 rankings
Universitas 21 (U21) produces an annual ranking not of universities but of 50 national university systems. There are 25 criteria grouped in four categories: resources, connectivity, environment and output. There is also an overall league table.
The resources section consists of various aspects of expenditure on tertiary education. Output includes publications, citations, performance in the Shanghai rankings, tertiary enrolment, graduates and graduate employment.
The top five in the overall rankings are USA, Switzerland, UK, Denmark and Sweden. No surprises there. The biggest improvements since 2013 have been by China, Malaysia, Russia, Saudi Arabia, Singapore and South Africa.
It is interesting to compare resources with output. The top ten for resources comprise six European countries, three of them in Scandinavia, Canada, the USA, Singapore and Saudi Arabia.
The bottom 10 includes two from Latin America, four, including China, from Asia, three from Eastern Europe, and South Africa.
There is a significant correlation of .732 between resources and output. But the association is not uniform. China is in 43rd place for resources but is 21st for output. Saudi Arabia is in the top ten for resources but 33rd for output. Malaysia is 11th for resources but 38th for output.
I have constructed a table showing the relationship between resources and output by dividing the output score by the resources score. The result shows how efficient systems are at converting money into employable graduates, instructing students and doing research. This is very crude, as are the data and the way in which U21 combines them, but it might have some interesting implications (a sketch of the calculation appears at the end of this post).
The top ten are:
1. China
2. USA
3. Italy
4. Russia
5. Bulgaria
6. Australia
7. UK
8. Ireland
9. Israel
10. Denmark
We have heard a lot about the lavish funding given to Chinese tertiary education. But it seems that China is also very good at turning resources into research and teaching.
The bottom ten are:
41. Austria
42. Brazil
43. Serbia
44. Chile
45. Mexico
46. India
47. Turkey
48. Ukraine
49. Saudi Arabia
50. Malaysia
At the moment the causes of low efficiency are uncertain. But it seems reasonable that the limitations of primary and secondary school systems and cultural attitudes to science and knowledge may be significant. The results of standardised tests such as PISA and TIMSS should be given careful attention.
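A minimal sketch of the efficiency calculation used for the table above. The country names and scores here are hypothetical placeholders rather than the actual U21 figures, which appear only in the U21 report.

```python
# Efficiency = output score / resources score, then rank countries by it.
# The scores below are hypothetical placeholders, NOT the real U21 values.
u21_scores = {
    "Country A": {"resources": 60.0, "output": 75.0},
    "Country B": {"resources": 90.0, "output": 70.0},
    "Country C": {"resources": 40.0, "output": 55.0},
}

efficiency = {
    country: s["output"] / s["resources"] for country, s in u21_scores.items()
}

for rank, (country, ratio) in enumerate(
        sorted(efficiency.items(), key=lambda kv: kv[1], reverse=True), start=1):
    print(f"{rank}. {country}: {ratio:.2f}")
```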
Sunday, August 13, 2017
The Need for a Self Citation Index
In view of the remarkable performance of Veltech University in the THE Asian Rankings, rankers, administrators and publishers need to think seriously about the impact of self-citation, and perhaps also intra-institutional citation. Here is the abstract of an article by Justin W Flatt, Alessandro Blasimme, and Effy Vayena.
Improving the Measurement of Scientific Success by Reporting a Self-Citation Index
Abstract:
Who among the many researchers is most likely to usher in a new era of scientific breakthroughs? This question is of critical importance to universities, funding agencies, as well as scientists who must compete under great pressure for limited amounts of research money. Citations are the current primary means of evaluating one’s scientific productivity and impact, and while often helpful, there is growing concern over the use of excessive self-citations to help build sustainable careers in science. Incorporating superfluous self-citations in one’s writings requires little effort, receives virtually no penalty, and can boost, albeit artificially, scholarly impact and visibility, which are both necessary for moving up the academic ladder. Such behavior is likely to increase, given the recent explosive rise in popularity of web-based citation analysis tools (Web of Science, Google Scholar, Scopus, and Altmetric) that rank research performance. Here, we argue for new metrics centered on transparency to help curb this form of self-promotion that, if left unchecked, can have a negative impact on the scientific workforce, the way that we publish new knowledge, and ultimately the course of scientific advance.
Keywords:
publication ethics; citation ethics; self-citation; h-index; self-citation index; bibliometrics; scientific assessment; scientific success
Saturday, August 12, 2017
The public sector: a good place for those with bad school grades
From the Economist ranking of British universities, which is based on the difference between expected and actual graduate earnings.
That, as Basil Fawlty said in a somewhat different context, explains a lot.
"Many of the universities at the top of our rankings convert bad grades into good jobs. At Newman, a former teacher-training college on the outskirts of Birmingham, classes are small (the staff:student ratio is 16:1), students are few (around 3,000) and all have to do a work placement as part of their degree. (Newman became a university only in 2013, though it previously had the power to award degrees.)
Part of Newman’s excellent performance can be explained because more than half its students take education-related degrees, meaning many will work in the public sector. That is a good place for those with bad school grades. Indeed, in courses like education or nursing there is no correlation between earnings and the school grades a university expects."
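The Economist's measure is essentially a value-added calculation: predict each institution's graduate earnings from the characteristics of its intake, then rank by the gap between actual and predicted earnings. Here is a minimal sketch of that idea; the figures are made up and the one-variable model is far simpler than the Economist's actual specification.

```python
# Value-added idea: rank universities by actual minus predicted graduate earnings.
# Entry tariffs and earnings below are made-up illustrative numbers.
universities = {
    "Uni A": {"entry_tariff": 180, "earnings": 31000},
    "Uni B": {"entry_tariff": 120, "earnings": 27000},
    "Uni C": {"entry_tariff": 90,  "earnings": 26000},
    "Uni D": {"entry_tariff": 150, "earnings": 26500},
}

# Fit a simple least-squares line: expected earnings as a function of entry tariff.
n = len(universities)
xs = [u["entry_tariff"] for u in universities.values()]
ys = [u["earnings"] for u in universities.values()]
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# Value added = actual earnings minus what the entry grades alone would predict.
value_added = {
    name: u["earnings"] - (intercept + slope * u["entry_tariff"])
    for name, u in universities.items()
}
for name, va in sorted(value_added.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {va:+.0f}")
```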
Friday, August 11, 2017
Malaysia and the Rankings Yet Again
Malaysia has had a complicated relationship with global university rankings. There was a fleeting moment of glory in 2004 when Universiti Malaya, the national flagship, leaped into the top 100 of the THES-QS world rankings. Sadly, it turned out that this was the result of an error by the rankers who thought that ethnic minorities were international faculty and students. Since then the country's leading universities have gone up and down, usually because of methodological changes rather than any merit or fault of their own.
Recently though, Malaysia seems to have adopted sensible, if not always popular, policies and made steady advances in the Shanghai rankings. There are now three universities in the top 500, UM, Universiti Sains Malaysia (USM) and Universiti Kebangsaan Malaysia (UKM). UM has been rising since 2011 although it fell a bit last year because of the loss of a single highly cited researcher listed in the Thomson Reuters database.
The Shanghai rankings rely on public records and focus on research in the sciences. For a broader based ranking with a consistent methodology and teaching metrics we can take a look at the Round University Rankings. There UM is overall 268th. For the 20 metrics included in these rankings UM's scores range from very good for number of faculty and reputation (except outside the region) to poor for doctoral degrees and normalised citations.
The story told by these rankings is that Malaysia is making steady progress in providing resources and facilities, attracting international students and staff, and producing a substantial amount of research in the natural sciences. But going beyond that is going to be very difficult. Citation counts indicate that Malaysian research gets little attention from the rest of the world. The Shanghai rankings report that UM has zero scores for highly cited researchers and papers in Nature and Science.
In this year's QS world rankings, UM reached 114th place overall and there are now hopes that it will soon reach the top 100. But it should be noted that UM's profile is very skewed with a score of 65.7 for academic reputation and 24.3 for citations per faculty. Going higher without an improvement in research quality will be very challenging since the reputation curve becomes very steep at this level, with dozens of survey responses needed just to go up a few points.
It might be better if Malaysia focused more on the Shanghai rankings, the Round University Rankings and the US News Best Global Universities. Progress in these rankings is often slow and gradual but their results are usually fairly consistent and reliable.
Tuesday, August 08, 2017
Excellent Series on Rankings
I have just come across a site, ACCESS, that includes a lot of excellent material on university rankings by Ruth A Pagell, who is Emeritus Faculty Librarian at Emory University and Adjunct Faculty at the University of Hawaii.
I'll provide specific links to some of the articles later
Go here
Saturday, August 05, 2017
There is no such thing as free tuition
It is reported that the Philippines is introducing free tuition in state universities. It will not really be free. The government will have to find P100 billion from a possible "re-allocation of resources."
If there is a graduate premium for degrees from Philippine universities then this measure will increase existing social inequalities and result in a transfer of wealth from the working class and small businesses to the privileged educated classes.
Unless lecturers work for nothing and buildings and facilities materialize, Hogwarts style, out of nothing, tuition is never free.
Who educates the world's leaders?
According to Times Higher Education (THE), the UK has educated more heads of state and government than any other country. The USA is a close second followed by France. No doubt this will get a lot of publicity as the THE summit heads for London but, considering the state of the world, is it really something to be proud of?
Thursday, August 03, 2017
America's Top Colleges: 2017 Rankings
America's Top Colleges is published by Forbes business magazine. It is an unabashed assessment of institutions from the viewpoint of the student as investor. The metrics are post-graduate success, debt, student experience, graduation rate and academic success.
The top three colleges are Harvard, Stanford and Yale.
The top three liberal arts colleges are Pomona, Claremont McKenna and Williams.
The top three low debt private colleges are College of the Ozarks, Berea College and Princeton.
The top three STEM colleges are MIT, Caltech and Harvey Mudd College.
Wednesday, August 02, 2017
Ranking Rankings
Hobsons, the education technology company, has produced a ranking of global university rankings. The information provided is very limited and I hope there will be more in a while. Here are the top five according to a survey of international students inbound to the USA.
1. QS World University Rankings
2. THE World University Rankings
3. Shanghai ARWU
4. US News Best Global Universities
5. Center for World University Rankings (formerly published at King Abdulaziz University).
University of Bolton head thinks he's worth his salary
George Holmes, vice-chancellor of the University of Bolton, with a salary of GBP 220,120 and owner of a yacht and a Bentley, is not ashamed of his salary. According to an article by Camilla Turner in the Daily Telegraph, he says that he has had a very successful career and he hopes his students will get good jobs and have Bentleys.
The university is ranked 86th in the Guardian 2018 league table, which reports that 59.2% of graduates have jobs or are in postgraduate courses six months after graduation. It does not appear in the THE or QS world rankings.
Webometrics puts it 105th in the UK and 1846th in the world so I suppose he could claim to be head of a top ten per cent university.
Perhaps Bolton should start looking for the owner of a private jet as its next vice-chancellor. It might do even better.
Tuesday, August 01, 2017
Highlights from the Princeton Review
Here are the top universities in selected categories in the latest Best Colleges Ranking from Princeton Review. The rankings are based entirely on survey data and are obviously subjective and vulnerable to sampling error.
Most conservative students: University of Dallas, Texas
Most liberal students: Reed College, Oregon
Best campus food: University of Massachusetts Amherst
Happiest students: Vanderbilt University, Tennessee
Party schools: Tulane University, Louisiana
Don't inhale: US Coast Guard Academy, Connecticut
Best college library: University of Chicago, Illinois
Best-run college: University of Richmond, Virginia
Most studious students: Harvey Mudd College, California
Most religious students: Thomas Aquinas College, California
Least religious students: Reed College, Oregon
Best athletic facilities: Auburn University, Alabama.