Wednesday, July 09, 2014

The Noise

An article in Research in Higher Education by Shari Gnolek, Vincenzo Falciano and Ralph Kuncl discusses what would be required for a university in the mid-30s, like Rochester, to break into the top 20 of US News & World Report's (USNWR) America's Best Colleges. The answer, briefly and bluntly, is a lot more than Rochester is ever going to have.

Universities in Russia, India, Pakistan, Malaysia, Indonesia and other places often promise that one day they will be in the top 100 or 50 or 10 of one of the international rankings. It would be interesting to see how much money they propose to spend.

An intriguing aspect of this paper is the concept of noise. The authors find that the USNWR rankings show a lot of volatility, with universities bouncing up and down for no particular reason, and that any change of four places or fewer can be regarded as a random fluctuation that should not give journalists or university administrators a heart attack or send them strutting around the campus.


One of the authors, a former vice provost at Johns Hopkins, told interviewers:

' “the trustees would go bananas” when Johns Hopkins dropped in the rankings. The administration would then have to explain what had happened.
“Every year Hopkins went from 15 to 16 to 15 to 16 – and I thought, ‘What a silly waste of energy,' ” Kuncl said in an interview Monday. (Johns Hopkins is currently No. 12.)
The paper found that small movements up or down in the rankings are more or less irrelevant. For most universities in the top 40, any movement of two spots or less should be considered noise, the paper said. For colleges outside the top 40, moves up or down of four spots should be thought of as noise, too.'

The amount of noise generated by a ranking is probably a good negative indicator of its reliability. We are, after all, dealing with institutions that receive millions of dollars in funds, produce thousands of papers and tens of thousands of citations, enroll thousands of students, graduate some of them, employ hundreds of bureaucrats, faculty and adjuncts, and so on. We should not expect massive fluctuations from year to year.

I have calculated the average movement up or down the top 100 of Shanghai Jiao Tong University's Academic Ranking of World Universities (ARWU) between 2011 and 2012 and between 2012 and 2013, and of the Times Higher Education (THE) and Quacquarelli Symonds (QS) World University Rankings between 2012 and 2013. A university falling out of the top 100 altogether was counted as falling to 101st place.
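For anyone who wants to replicate the calculation, here is a minimal Python sketch; the rank dictionaries are invented placeholders, not the published lists.

# Minimal sketch of the volatility calculation described above.
# The rank dictionaries are invented placeholders, not the published lists.

def average_movement(old_ranks, new_ranks, cutoff=100):
    # Mean absolute change in rank for universities in the earlier
    # year's top `cutoff`; a university that drops out of the table
    # altogether is counted as falling to cutoff + 1.
    moves = [abs(new_ranks.get(uni, cutoff + 1) - rank)
             for uni, rank in old_ranks.items()]
    return sum(moves) / len(moves)

ranks_2012 = {"University A": 17, "University B": 18, "University C": 99}
ranks_2013 = {"University A": 18, "University B": 17}  # C dropped out entirely

print(average_movement(ranks_2012, ranks_2013))  # (1 + 1 + 2) / 3 = 1.33...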

In 2013 ARWU had a problem with Thomson Reuters, which was supposed to be preparing a new list of highly cited researchers, and simply recycled the scores for that indicator from the 2012 rankings. This reduced the volatility of the rankings somewhat, so changes between 2011 and 2012 were also analysed.

Starting with the top 20 of ARWU, there was an average change of 0.25 places between 2012 and 2013 and of 0.15 between 2011 and 2012.

Between 2011 and 2012 the University of California San Francisco fell from 17th to 18th place, Johns Hopkins rose from 18th to 17th and the University of Tokyo fell from 20th to 21st. There were no other changes in the top twenty.


Moving on to the QS top 20 between 2012 and 2013, there were some significant changes, including Stanford rising from 15th to 7th and the University of Michigan falling from 17th to 22nd. The average change was 1.7 places.


At the top, the THE world rankings were somewhat less volatile than QS but much more so than ARWU. The average change was 1.2 places, and the biggest was for University College London, which fell from 17th to 21st.

So university administrators should be concerned about any change within the top twenty of ARWU but should not be bothered by a change of one or two places in the THE or QS rankings.

Moving on to the top 100, the average change in ARWU was 1.66 places between 2012 and 2013 and 2.01 between 2011 and 2012.

The biggest change between 2011 and 2012 was Göttingen, which fell from 86th to the 101-150 band.

In the QS rankings between 2012 and 2013 the average change in the top 100 was 3.97 places. Substantial changes included Boston University, which fell from 64th to 79th, and the University of Birmingham, which rose from 77th to 62nd.


In the THE top 100 the average change was 5.36 places. Notable changes included Lund University falling from 82nd to 123rd, Montreal falling from 84th to 106th and King's College London rising from 57th to 38th.


So, it can be concluded that the ARWU rankings are the most reliable. For the top 100, QS is more reliable than THE but the reverse is the case for the top 20.

How can these differences be explained? To be certain it would be necessary to look at the separate indicators in the three rankings, but here are some thoughts.

The dominant indicator in the QS rankings is the academic survey, which has a weighting of 40%. Any fluctuation in the survey could have a disproportionate effect on the overall score. The most important single indicator in the THE rankings is Citations: Research Influence, which has a 30% weighting but contributes a higher proportion of total scores because the regional adjustment gives an extra boost to countries with a limited research base. In contrast, no indicator in the Shanghai rankings has more than a 20% weighting.
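A back-of-the-envelope calculation shows why this matters: in a linear composite, a swing of d points in an indicator with weight w moves the overall score by w times d. Here is a hedged sketch with invented scores; the indicator names and numbers are illustrative only.

# Illustrative only: how a swing in a single indicator propagates to
# the composite score under different weightings. Scores are invented.

def composite(scores, weights):
    return sum(scores[k] * weights[k] for k in weights)

weights = {"survey": 0.40, "everything_else": 0.60}
before = {"survey": 80.0, "everything_else": 70.0}
after = {"survey": 75.0, "everything_else": 70.0}  # 5-point survey drop

print(round(composite(before, weights) - composite(after, weights), 2))
# 2.0 -- a 5-point swing in a 40% indicator moves the composite by
# 5 * 0.40 = 2.0 points; if no indicator exceeds 20%, as in ARWU,
# the same swing moves it by at most 5 * 0.20 = 1.0 point.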

The THE rankings include inputs such as income. An injection of research funds from a corporation would immediately improve a university's position in the income from industry, research income and total institutional income indicators. It would be a few years before the funds produced an improvement, if they ever did, in the publications indicator in ARWU, and even longer in the Citations per Faculty indicator in the QS rankings.

ARWU uses publicly available data that can be easily checked and is unlikely to fluctuate very much from year to year. THE and QS also use data submitted by institutions. There is room for error as data flow from branch campuses and research centres to the central administrators and then to the rankers. QS also has the option of replacing institutional data with that from third-party sources.

So everybody should relax when reading this year's rankings, unless your university has risen or fallen by more than two spots in ARWU, four in the QS rankings or six in THE's.





Saturday, July 05, 2014

University Mission Creep


The demands on Western universities appear limitless.

Recently, the White House issued a plan to evaluate American universities according to the value that they offer. Proposals for indicators included the percentage of students receiving Pell grants, tuition fees, scholarships, student debt and graduate employment.

Not so long ago there were proposals that the US News and World Report's law school rankings should include the percentage of ethnic minority students.

Meanwhile, at the International Rankings Expert Group conference in London in June, there were calls for universities to be ranked according to their contributions to environmental sustainability.

The Universitas 21 rankings of higher education systems now include a 2% indicator for female academics and 2% for female students.

So universities are to be judged for admitting disadvantaged students, seeing that they, along with others, graduate, making sure that graduates get jobs, making sure that every classroom (well, maybe not in the humanities) is suitably diverse for race, gender and gender-orientation (although perhaps not, one suspects, for religion or politics) and contributing to environmental sustainability. No doubt there will be others: third mission, LGBT friendliness, community engagement, transformativeness? Doing research and providing instruction in disciplines and professions are no longer enough.

Now, The Upshot blog at the New York Times has gone further. American colleges and universities are apparently responsible for the limited literacy, numeracy and problem-solving skills of the entire population of the USA, whether or not they have been anywhere near a university.

The writer, Kevin Carey, claims that US primary and secondary schools are performing badly compared to their international counterparts. American students do badly on the Programme for International Student Assessment (PISA), tests run by the Organisation for Economic Cooperation and Development (OECD).

" (T)he standard negative view of American K-12 schools has been highly influenced by international comparisons. The Organization for Economic Cooperation and Development, for example, periodically administers an exam called PISA to 15-year-olds in 69 countries. While results vary somewhat depending on the subject and grade level, America never looks very good. The same is true of other international tests. In PISA’s math test, the United States battles it out for last place among developed countries, along with Hungary and Lithuania"

There are, as noted by blogger Steve Sailer, substantial variations by race/ethnicity in the 2012 PISA tests. The average score for the three tests -- mathematics, reading, science -- is 548 for Asian Americans, just below Hong Kong and just ahead of South Korea; 518 for White Americans, the same as Switzerland; 465 for Hispanics, comfortably ahead of Chile and Costa Rica; and 434 for African Americans, well ahead of Brazil and Tunisia. The US education system is doing an excellent job of raising and keeping the skills of African Americans and Hispanics well above the levels of Africa, the Arab World and Latin America. Scores for Whites are comparable to Western Europe and those for Asian Americans to Greater China, except for Shanghai, which is, in several respects, a special case.

What American schools have not done is close the attainment gap between African Americans/Hispanics and Whites/Asians (probably meaning just Northeast Asians). Until the skills gap between Latin America and Africa on the one hand and Germany and Taiwan on the other is closed, this will not be a unique failing.

The post goes on to observe that Americans think that their higher education system is superior because there are 18 and 19 US universities respectively in the top 25 in the Times Higher Education (THE) and the Shanghai rankings. It does not mention that in the latest Quacquarelli Symonds (QS) rankings the number goes down to 15.

"International university rankings, moreover, have little to do with education. Instead, they focus on universities as research institutions, using metrics such as the number of Nobel Prize winners on staff and journal articles published. A university could stop enrolling undergraduates with no effect on its score.
We see K-12 schools and colleges differently because we’re looking at two different yardsticks: the academic performance of the whole population of students in one case, the research performance of a small number of institutions in the other." 

This is a little inaccurate. It is correct that the Shanghai rankings are entirely research based. The THE rankings, however, do have a cluster of indicators that purport to have something to do with teaching, although the connection with undergraduate teaching is tenuous, since the teaching reputation survey is concerned with postgraduate supervision and there is an indicator that gives credit for the number of doctoral students compared to undergraduates. But the THE rankings, along with those published by QS, are not exclusively research orientated, and there is no evidence that American universities are uniquely deficient in undergraduate teaching.

If a university stopped admitting undergraduate students, its score on the THE and QS rankings would rise, since it would do better on the staff-student ratio indicator. Eventually, when all enrolled undergraduates had graduated, it would be removed from the rankings, since undergraduate teaching is a requirement for inclusion in these rankings.

Carey continues with a discussion of the results of the PIAAC (Programme for the International Assessment of Adult Competencies) survey, conducted by the OECD.


"Only 18 percent of American adults with bachelor’s degrees score at the top two levels of numeracy, compared with the international average of 24 percent. Over one-third of American bachelor’s degree holders failed to reach Level 3 on the five-level Piaac scale, which means that they cannot perform math-related tasks that “require several steps and may involve the choice of problem-solving strategies.” Americans with associate’s and graduate degrees also lag behind their international peers.

American results on the literacy and technology tests were somewhat better, in the sense that they were only mediocre. American adults were eighth from the bottom in literacy, for instance. And recent college graduates look no better than older ones. Among people ages 16 to 29 with a bachelor’s degree or better, America ranks 16th out of 24 in numeracy. There is no reason to believe that American colleges are, on average, the best in the world."


It is true that only 18% of American bachelor degree holders reach numeracy level 4 or 5 on the PIAAC survey, compared to the OECD average of 24%. If, however, we look at those who reach level 3*, 4 or 5, American bachelor degree holders do slightly better, with 74% compared to the international average of 70%.

Looking at all levels of education, it is noticeable that for numeracy, Americans with less than a high school education do badly compared to their OECD counterparts: nine per cent reach level 3, 4 or 5, compared with the OECD average of 24%. This 15-point difference increases to 20 points for high school graduates and falls to 17 points for associate degree holders. American bachelor degree holders are 4 points ahead of the average, while graduate and professional degree holders are 4 points behind.
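The pattern is easier to see with the differences laid out in one place; the snippet below simply restates the US-minus-OECD gaps quoted above (percentage reaching numeracy level 3, 4 or 5).

# Restating the PIAAC numeracy comparison above: US percentage minus
# OECD average percentage reaching level 3, 4 or 5, by education level.
numeracy_gaps = {
    "less than high school": 9 - 24,  # -15 points
    "high school graduate": -20,
    "associate degree": -17,
    "bachelor degree": +4,
    "graduate/professional degree": -4,
}
for level, gap in numeracy_gaps.items():
    print(f"{level:30s} {gap:+d}")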

For literacy, American universities are average or slightly better than average. American associate and bachelor degree holders have the same percentages reaching level 4 or 5 as the OECD average -- 14% and 24% respectively -- and holders of graduate and professional degrees are slightly ahead -- 33% compared to 32%.

For problem solving in technology-rich environments, Americans lag behind at all levels, but the gap between Americans and the OECD average gradually diminishes from 10 points for those with less than a high school education and for high school graduates to 6 for those with an associate degree, 4 for those with a bachelor's degree and 3 for those with graduate or professional degrees.

It seems unfair to blame American universities for the limitations of those who have never entered a university or even completed high school.

There is nothing in the PIAAC to suggest that American universities are currently performing worse than the rest of the OECD. They are, however, lagging behind Japan, Korea and Greater China, and this is beginning to be confirmed by the international university rankings.

In any case, it is very debatable whether there is anything universities can do to override the remorseless effects of demography, social change, immigration and the levelling of primary and secondary education that are steadily eroding the cognitive abilities of the American population.

Even more striking, differences between the US and the rest of the developed world are relatively modest compared with those within the country.

Only 16% of White Americans are at level 4 or 5 for literacy, but that is much better than the 3% of Blacks and Hispanics. For numeracy the numbers are 12%, 1% and 2%, and for problem solving 8%, 2% and 2%.

US universities are probably on average as good as the rest of the OECD, although it could be argued that the advantages of language and money ought to make them much better. But they cannot be held responsible for the general mental abilities of the whole population. That is much more dependent on demography, migration and social policy.

It is likely that as the assault on competition and selection spreads from primary and secondary schools into the tertiary sector, American universities will slowly decline especially in relation to Northeast Asia.




* Level 3: "Tasks at this level require the application of number sense and spatial sense; recognising and working with mathematical relationships, patterns, and proportions expressed in verbal or numerical form; and interpreting data and statistics in texts, tables and graphs."



Tuesday, July 01, 2014

Are International Students a Good Indicator of Quality?

From University World News

"Tens of thousands of foreign students with invalid language test scores have been exposed in a groundbreaking investigation in Britain, while 57 private further education colleges have been stripped of their licences. Three universities have been prohibited from sponsoring new international students pending further investigations, and some 750 bogus colleges have been removed from the list of those entitled to bring foreign students to Britain."

The three universities are West London, Bedfordshire and Glyndwr. 

The proportion of students who are international accounts for 5% of the total weighting of the QS World University Rankings and 2.5% of Times Higher Education's.

Monday, June 30, 2014

THE Asian University Rankings

Times Higher Education Asian University Rankings

Source

Scope
Universities in Asia, including Southwest Asia, Turkey and Central Asia but not Australasia or the Pacific region.

Methodology
Unchanged since last year; the same as for the THE World University Rankings. (A sketch of how the weights combine follows the list.)

Teaching: the Learning Environment 30%  (includes 5 indicators)
Research: Volume, Income, Reputation  30% (includes 3 indicators)
International Outlook 7.5% (includes 3 indicators)
Industry Income: Innovation 2.5%
Citations: Research Influence 30%
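
As a rough illustration of how these weights combine into an overall score, here is a minimal sketch; the pillar scores are invented placeholders, not actual THE data.

# Sketch of combining the five pillar weights above into an overall
# score. The pillar scores are invented, not actual THE data.
THE_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
    "citations": 0.30,
}
assert abs(sum(THE_WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 100%

pillar_scores = {
    "teaching": 76.4,
    "research": 81.9,
    "international_outlook": 55.0,
    "industry_income": 90.0,
    "citations": 70.1,
}
overall = sum(pillar_scores[p] * w for p, w in THE_WEIGHTS.items())
print(round(overall, 1))  # 74.9, the weighted sum of the five pillars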

Top Ten

1.   University of Tokyo
2.   National University of Singapore
3.   University of Hong Kong
4.   Seoul National University
5.   Peking University
6.   Tsinghua University
7.   Kyoto University
8.   Korea Advanced Institute of Science and Technology
9=  Hong Kong University of Science and  Technology
9=  Pohang University of Science and Technology

Countries with Universities in the Top Hundred

Japan              20
China              18
Korea             14
Taiwan            13
India               10
Hong Kong      6
Turkey             5
Israel                3
Iran                  3
Saudi Arabia    3
Thailand           2
Singapore         2
Lebanon           1

Selected Changes

Hebrew University of Jerusalem down from 15th to 18th
Bogazici University, Turkey, up from 37th to 19th
Sogang University, Korea, down from 78th to 92nd
Panjab University, India,  from unranked to 32nd.
Keio University down from 53rd to 72nd