SRHE Blog

The Society for Research into Higher Education



In defence of SoTL: anchoring educational evaluation and educational research

by Liz Austen

By the end of 2025 I had attended three HE-related conferences: EuroSoTL, the Wonkhe Festival of HE and the SRHE Annual Conference. I presented on similar topics at all three events: what evidence do we generate to help us understand and act to enhance student experiences and outcomes in higher education? During the Wonkhe panel and my SRHE session, I defined two approaches at the disposal of HE practitioners:

Higher Education Evaluation: an approach which helps to understand and explore what works and doesn’t work in a given context and is of value to stakeholders. The aim of evaluation is to generate actionable evidence-informed learning, which encourages, informs and supports continuous improvement of process and impact (Evaluation Collective, 2025)

Higher Education Research: to extend knowledge and understanding in all areas of educational activity and from a wide range of perspectives, including those of learners, educators, policymakers and the public (adapted from BERA, 2024)

At the Wonkhe panel, Clare Loughlin-Chow (CEO of SRHE) helpfully outlined the higher education research topics that were most prevalent in the SRHE journals. Omar Khan (CEO of TASO) then outlined the scope and priorities of TASO, an affiliate member of the government’s What Works Network which focuses on higher education evaluation. My conceptual discussion of evidence generation brought the two together.

At EuroSoTL earlier in the year, my colleagues and I outlined our new institutional approach to the Scholarship of Teaching and Learning:

Scholarship of Teaching and Learning (SoTL): to improve student learning through engagement in the existing knowledge of teaching and learning, developing contextual ideas and innovation in practice, reflecting on practice, applying methodological rigour, working in partnership with students, and sharing scholarship publicly (adapted from Felten, 2013)

When I attended SRHE in December 2025, SoTL appeared in only one of the sessions I attended, and some of that discussion focused on the challenges of bringing SoTL into spaces for educational research. My hand-in-the-air comment – that criticism of SoTL by educational researchers was an example of ‘academic snobbery’ – certainly raised a few eyebrows. This blog post considers the relationship between these three approaches and whether, for the good of our students, it’s time for some reconciliation.

Educational evaluation, SoTL and educational research

Educational research in higher education has developed over the last 60 years. Interestingly, research into teaching and learning is cited as the most theorised area within this field (Tight, 2012). Higher education evaluation, sometimes considered applied research, was recently propelled by the Office for Students’ agenda to ‘evaluate, evaluate, evaluate’ (Office for Students, 2022). SoTL has developed alongside HE research and evaluation, emerging from Boyer’s work in 1990.

The aims of each endeavour are distinct, yet tied together by the notion of ‘enquiry’. Research seeks to build new knowledge, and evaluation seeks to provide judgement on a contextual problem. SoTL has a narrower focus on teaching and learning than the broader scope of research and evaluation, but incorporates prior knowledge and contextual problem solving through focused enquiry (Gray, 2025). SoTL builds on the foundation of social sciences methodology and can integrate disciplinary methodology into practitioners’ teaching and learning enquiry (Riddell, 2026). Educational evaluation often asks questions about the effectiveness of interventions, but in some teaching and learning spaces the evaluative language of ‘intervention’ isn’t appropriate (Austen (2025) in Austen and McCaig (2025)). Exploring what works through SoTL enquiry aligns better. Often the bridging term ‘pedagogic research’ is used as integral to SoTL (close to practice) but distinct from educational research (broader anticipated impact). Our chosen SoTL definition uses neither research nor evaluation terminology, but has component parts – knowledge, innovation, method, dissemination – that are central to all three.

The essential agents in educational research, evaluation and SoTL are the same – individual students (as partners, as participants and as voice-givers), individual staff (academic and professional services), institutional groups or clusters, collaborating HEIs, and third space organisations. Reasons for enquiry are also similar and include sector expectations and shared learning, the desire for institutional enhancement and impact, personal development and career progression. Or as Ashwin & Trigwell (in Evans et al., 2021) note: to inform a wider audience; to inform a group within context; to inform oneself. All research, evaluation and SoTL agents must navigate the practical and ethical considerations of ‘insider’ enquiry if they are exploring their own practices or working within their institutional contexts (BERA, 2018; Barnett & Camfield, 2016).

Output pathways are also interconnected. The SoTL staircase (Beckingham, 2023) recognises the variety of outputs encouraged by SoTL and includes those traditionally aligned with research and evaluation (reports and journal articles). Research outputs may be guided by REF criteria, and evaluation outputs by readership. The conclusions of research articles frequently state that more research is needed, and evaluation reports often sit unread in metaphorical desk drawers. In comparison, SoTL practitioners benefit from publications which are close to practice, quicker to publish, and more likely to influence change.

Both educational evaluation and educational research are inherently theoretical, grounded in educational or pedagogic theory or a theory of change. SoTL is more action focused: less theoretical than research, yet often more exploratory than evaluation. In 2011, Kanuka questioned SoTL’s credibility due to its lack of theoretical underpinning or reference to existing scholarship. At times, I suggest, educational research can be positioned too far in the opposite direction. The presentations at SRHE were heavily theoretical, and sometimes I was left thinking ‘so how would this work actually improve the learning experiences of students?’ In contrast, the breadth of SoTL includes both theory and action, albeit in more pragmatic ways.

Educational researchers and evaluators have their own values and specific skill sets (and epistemological disagreements often occur between the two). This commitment to identity can be exclusionary, and may help to explain why SoTL has been challenged. Canning & Masika (2022) caution us on the ‘threat to serious scholarship’ posed by SoTL, which they believe risks devaluing research into higher education learning and teaching. Their criticism of ‘anything goes’ I would reframe as an important approach to inclusion. Their criticism of the ‘watered-down version of teaching and learning research’ I frame as SoTL’s recognition of the developmental, particularly in building staff confidence. Where confusion over definitions and scope still occurs, I question whether institutional SoTL has been well grounded or well led.

Conclusion

There is clearly a divide between higher education research and SoTL. Few recent SRHE blog posts reference SoTL at all, and one that does advises against flag-in-the-sand nomenclature (Sheridan, 2019). Having spent a lot of time in these circles, I believe higher education evaluators are more agnostic, but I include them in this discussion as they bring a new dynamic to this debate.

In this blog I have identified the ways in which research, evaluation and SoTL have their own agendas and yet have much in common. I argue that SoTL emerges as a grounding anchor between higher education research and higher education evaluation. SoTL borrows from both. SoTL feeds into both. SoTL is more than both (Potter, 2025). SoTL’s inherent value is the ability to build a community which improves student experiences and outcomes in an enquiry led and timely way.

For more details on the approach to SoTL at Sheffield Hallam University see: https://lta.shu.ac.uk/scholarship

Reference

Riddell, J (2026) ‘Hope circuits in practice: how the scholarship of teaching and learning fuels pedagogical courage and systemic change’ Guest Lecture, Sheffield Hallam University

Liz Austen is Professor of Higher Education Evaluation and Associate Dean Learning, Teaching and Student Success at Sheffield Hallam University. She has worked as an independent Evaluation Consultant on HE sector contracts and is a regular keynote speaker on all things evaluation in HE. Her focus is on evidence informed practice across the student lifecycle. Liz also leads a cross sector HE network called the Evaluation Collective.




Tunnel vision: higher education policy and the Office for Students

by Rob Cuthbert

In January 2022 the Office for Students published three sets of consultations, 699 pages of proposals for the regulation of student outcomes, the determination of teaching excellence, and the construction of indicators to measure student experience and outcomes. These were not separate initiatives, but part of a co-ordinated programme which needs to be seen in the context of the long-awaited government response to the 2019 Augar report, finally published in March 2022[1].

The OfS consultation announced that numerical thresholds will underpin requirements for minimum acceptable student outcomes at both undergraduate and postgraduate level. Universities and colleges not meeting these could face investigation, with fines and restrictions on their access to student loan funding available as potential sanctions. For full-time students studying a first degree, the thresholds require: 80% of students to continue into a second year of study; 75% of students to complete their qualification; 60% of students to go into professional employment or further study.

Not just numbers? OfS say: “we recognise that using our indicators to measure the outcomes a provider delivers for its students cannot reflect all aspects of the provider’s context … If a provider delivers outcomes for its students that are below a numerical threshold, we will make decisions about whether those outcomes are justified by looking at its context. This approach would result in a rounded judgement about a provider’s performance.”

But then: “… such an approach may present a challenge for some providers. This is because they must only recruit students where they have understood the commitment they are making to support their students to succeed, irrespective of their backgrounds. … Most universities and colleges relish this challenge and already deliver on it. However, some do not. While some may offer opportunities for students to enter higher education, we also see low continuation and completion rates and disappointing levels of progression to relevant employment or further study.” A warning, then, for “some”, but not “most”, providers.

The OfS approach will be fine-grained: “We would consider whether a provider has complied with condition B3 in relation to each separate indicator or split indicator. This enables us to identify ‘pockets of provision’ where performance in a specific subject, for students with specific characteristics, or in relation to partnership arrangements, falls below a numerical threshold”.

‘Selecting’ universities might think that ‘contextual judgment’ will rescue them, but may still decide to play safe in subjects where the numbers don’t look so good. ‘Recruiting’ universities, especially in ‘levelling up’ areas, might be looking at the numbers across many programmes and considering their strategy. Everyone will be incentivised to play safe and eliminate what are numerically the most marginal candidates, subjects and courses. And everyone thinks this will discriminate against disadvantaged students. For example, the University Alliance response published on 16 March 2022 said: “The University Alliance is gravely concerned that the proposals outlined by government could have unintended consequences for the least privileged students in society.”

Sally Burtonshaw (London Higher) blogged for HEPI on 26 January 2022: “As the dust begins to settle on the 699 pages of Office for Students’ (OfS) consultations and accompanying documents published on Thursday and providers across the sector begin to draft responses (deadline March 17th), it feels like there is a gaping chasm between the sector and its regulator. Language in the accompanying press release with references to ‘crack downs’, ‘tough regulatory action’ and ‘protecting students from being let down’, jars with a sector which has contributed so much throughout the pandemic.”

Diana Beech (London Higher) blogged for HEPI on 7 March 2022 about the government response to Augar and the OfS consultations: “… what we are facing now is not a series of seemingly independent consultations concerned with the minutiae of regulation, but a multi-pronged and coordinated assault on the values our higher education sector holds dear.” Diana Beech was a policy adviser to the last three ministers for universities.

SRHE Fellow Peter Scott summed it up like this: “This … ‘direction of travel’ is … based on the assumption that we should continue to distinguish between FE and HE, vocational and academic tracks, in terms of their social bases and costs. Of course, that is the current reality. Universities, especially Russell Group ones, draw a disproportionate number of their students from socially-privileged backgrounds, while FE is badly under-funded. This is why it makes (economic) sense for the Government to try to divert more students there. But is that sustainable in a country that aspires to being both democratic and dynamic? Most other countries have moved on and now think in terms of tertiary systems embracing HE, FE, on-the-job training, adult and community learning, the virtual stuff … bound together by flexible pathways and equitable funding – and, above all, by fair access. In the UK, Wales is setting the pace, while Scotland has had its ‘Learner Journey 15-24’ initiative. In England, sadly, there is no echo of such positive thinking.”

Status hierarchies must, it seems, be maintained, and not just between HE and FE, but also between universities. Contrary to expectations, the Teaching Excellence Framework will rise from the ashes of the Pearce Review via the OfS’s second consultation. Earlier versions of TEF did not reliably reproduce the existing status hierarchies; some Russell Group institutions even suffered the indignity of a bronze rating. Clearly this could not be allowed to continue. So now: “The proposed TEF process is a desk-based, expert review exercise with decisions made by a panel of experts to be established by the OfS. The panel would consider providers’ submissions alongside other evidence. … TEF assessment should result in an overall rating for each provider. The overall rating would be underpinned by two aspect ratings, one for student experience and one for student outcomes but there would be no rating of individual subjects within a provider.” Such undifferentiated provider-level arrangements will surely be enough to ensure no further embarrassment for those with the highest reputations.

There will still be gold, silver and bronze awards, but not for all. The OfS script is worthy of Yes Minister: “… our minimum baseline quality requirements establish a high quality minimum for all providers. Therefore, quality identified that is materially above the relevant baseline quality requirements should be considered as ‘very high quality’ or ‘outstanding quality’ … ‘Outstanding quality’ signifies a feature of the student experience or outcomes that is among the very highest quality found in the sector for the mix of students and courses taught by a provider. … ‘Very high quality’ signifies a feature of the student experience or outcomes that is materially above the relevant minimum baseline quality requirements for the mix of students and courses taught by a provider.” Is the difference clear? If not, don’t worry, because the TEF Panel will decide.

As Sir Humphrey might have put it: it’s like the Olympics – not everyone will get on the podium. And it’s like ice dancing: judges hand out the marks based on how they rate the performance. The table of “features of excellence” spells out the criteria, for example: “The provider uses research in relevant disciplines, innovation, scholarship, professional practice and/or employer engagement to contribute to an outstanding academic experience for its students.” Whereas for high quality: “The provider uses research in relevant disciplines, innovation, scholarship, professional practice and/or employer engagement to contribute to a very high quality academic experience for its students.” Is the difference clear? If not, don’t worry, because the TEF Panel will decide.

Nick Hillman blogged for HEPI on 21 January 2022 about the OfS initiatives, reflecting on the limited success of previous attempts to shift evaluation towards metricisation, and Debbie Mcvitty blogged for Wonkhe on 24 January 2022 with a helpful potted history. There will be no surprises in the outcomes of the consultations. Whether or not the Titanic is sinking, we are consulted only on how to arrange the deckchairs. As HEPI’s Nick Hillman said: “I vividly recall what Les Ebdon, the former Director for Fair Access, said a few years ago when he was asked, “What will the Office for Students do?” His answer was, “It’s very simple. I can tell you exactly what the OfS will do. It will do whatever the government of the day wants it to do.” And so it has proved.”

Let us, then, look not at the entirely predictable outcomes, but at the style the OfS has adopted to reach them. The consultation on regulation of outcomes is telling. It takes 100 pages to assemble a rational-bureaucratic edifice in rational-bureaucratic language, with chapter headings including: “… making judgments about compliance with condition B3 … Addressing statistical uncertainty in the assessment of condition B3 … Taking regulatory action when a breach is identified …”. There could have been headings like: “How do we know how good the performance is?” or “What if something goes wrong?”. But that would have exposed the deeper questions, for which answers have already been decided. Instead we are drowned in bureaucratic detail. Details are always necessary, but we should be reminded of why they are needed. These documents do their best to obscure the fait accompli which is their starting point, with a grinding, remorseless pseudo-rationality which encourages you to lose sight of purposes and values.

In 699 pages of consultation the OfS has done its bureaucratic best to profess transparency, openness and rigour, while diverting our energies and attention from what an experienced ministerial adviser called the ‘assault on the values which our HE sector holds dear’. The consultations amount to a detailed enquiry about how exactly these values should be assaulted. We are in a consultation tunnel with only one track. What we can see is probably not the light at the end of the tunnel: it may be the lights of an oncoming train.

Rob Cuthbert, editor of SRHE News and Blog, is emeritus professor of higher education management, Fellow of the Academy of Social Sciences and Fellow of SRHE. He is an independent academic consultant whose previous roles include deputy vice-chancellor at the University of the West of England, editor of Higher Education Review, Chair of the Society for Research into Higher Education, and government policy adviser and consultant in the UK/Europe, North America, Africa, and China.

Email rob.cuthbert@uwe.ac.uk, Twitter @RobCuthbert.


[1] Covered elsewhere in this issue of SRHE News. SRHE members can read this and previous editions of SRHE News via https://srhe.ac.uk/my-account/