By Paul Temple
Where did the obsession with comparisons in education come from? In his 1997 book The Audit Society, Michael Power identifies the causes of what he calls “the audit explosion” and the related demands for public-sector performance measurement in the 1980s and 1990s. The shock-wave of this explosion ripples on.
But one recent case, the OECD’s AHELO (Assessment of Higher Education Learning Outcomes) project, shows there may be limits to the comparison industry’s growth. This project seems to have stalled following concerns about its proposed methodology and likely costs: England said last year it wouldn’t take part, and American and Canadian universities have also said no. The OECD’s Institutional Management in Higher Education (IMHE) programme recommended in 2012 that the project be halted after seeing the results of a large-scale feasibility study. But the OECD’s Director of Education and Skills, Andreas Schleicher, is apparently undeterred, if his HEPI lecture (Value-Added: How do you measure whether universities are delivering for their students? HEPI 2015 Annual Lecture. HEPI Report 82) last December is anything to go by.
Schleicher thinks that his plans are being blocked by “political economy”: posh universities don’t want to hear that “we may not look as beautiful as we thought, or as beautiful as others have told us we are.” This is no doubt true – who does want to hear that? – but the AHELO proposal as Schleicher now outlines it is so beset with difficulties that it is surprising it has got as far as it has.
For a start, the project depends on some fairly remarkable assumptions about institutional behaviour. As Schleicher describes it, individual universities would set student learning outcomes on a subject basis, and the extent to which these are achieved would offer “profound insights into the effectiveness of teaching and learning” in the department. (Note, though, that these subject assessments would then need to be somehow aggregated, because AHELO would “[use] the institution as the unit of comparison”.) There may be some universities where there would be a discussion on the lines of, “We must set very demanding learning outcomes, even if it means that the metric will show that many of our students are failing”. If you know of such a university, do tell.
Schleicher seems, then, to discount the possibility that universities will game the process – even if they believed that their data might be used in a resource allocation process perhaps called – oh, I don’t know – the Teaching Excellence Framework. He offers, rather charmingly, an external check here by suggesting that alumni be asked for their views on what they learned as students: though he concedes that this might not be “sufficient as a proxy for learning outcomes”. Goodhart’s Law – when a measure becomes a target, it ceases to be a useful measure – casts its shadow over all this. (The 2012 AHELO feasibility study undertook centralised, online testing of a sample of students in two disciplines across different countries, on the PISA model. What Schleicher discussed in his lecture was a different proposition.)
Another difficulty is the practical problem of measuring outcomes. Setting the learning outcomes for a particular academic programme is the easy part: the hard part, which takes up most of the time of exam boards, is determining the extent to which a particular student has achieved them. At undergraduate level (the focus of AHELO), there will be some fields where this may be more straightforward than others, but hardly any in which no dispute is possible. Moreover, Schleicher accepts that learning at university involves developing “transversal skills” – critical thinking, communication skills, and so on – which need to be covered in the measurement of learning outcomes. So the physics exam board not only needs to judge whether or not a student has grasped the second law of thermodynamics, but also the extent of the transversal skills developed during the study thereof. These results will then feed into an overall university score – which will be compared with university scores from China, Chad and Chile.
So, good luck with that, Andreas!
SRHE member Paul Temple, Centre for Higher Education Studies, UCL Institute of Education, University College London