SRHE Blog

The Society for Research into Higher Education


The missing middle ground between research-led and practice-led education

by Saeed Talebi and Nick Morton

A peer reviewer recently challenged our pedagogical approach. We had described embedding an industry-led research project on Digital Twin development into our built environment curriculum as ‘research-informed teaching’. The reviewer disagreed: this was ‘practice-led rather than research-informed,’ they argued, because students weren’t producing research outputs themselves.

The comment revealed a conceptual confusion we suspect is widespread in higher education. We often assume that if students aren’t producing original research, then any industry-focused teaching must simply be vocational training with academic window-dressing. This leaves practice-facing disciplines in an awkward position: industry engagement is essential to what we do, but it risks being dismissed as less scholarly. There is, however, a middle ground.

Healey and Jenkins’ (2009) model offers a useful way through this confusion. They identify four modes of engaging undergraduates with research: research-led (learning about current scholarship), research-oriented (learning research methods), research-based (undertaking inquiry), and research-tutored (engaging in research discussions). These are mapped across two dimensions: whether students are positioned as audience or participants, and whether the emphasis falls on research content or processes. The model’s key insight is that students can be meaningfully engaged with research even when they aren’t producing research outputs themselves. The question isn’t simply whether students are ‘doing research’, it’s whether they’re positioned as passive recipients of established knowledge or as active participants in scholarly inquiry.

Practice-led teaching operates on different logic, though that logic has a closer relationship to applied research than is sometimes acknowledged. Its primary aim is developing professional competence through authentic engagement with messy problems and competing stakeholder priorities. The distinction isn’t whether industry is involved – it can be present in both approaches. The distinction lies in how students are positioned in relation to knowledge. In practice-led education, knowledge tends to be treated as relatively settled. In research-informed education, knowledge is contested, evolving, and open to question. An opportunity arises when these approaches coincide without conscious design, and a risk emerges when they collapse into one another. Research-informed teaching can become performative, referencing staff publications without changing how students learn. Practice-led teaching can slip into employability theatre, where live briefs are added without interrogating what knowledge students are actually developing.

As Professor Hanifa Shah OBE recently argued in Times Higher Education, STEAM education at its best equips students to “move fluidly between analytical and imaginative modes of thinking”, asking critical questions, considering ethical implications, and bringing meaning to innovation. This is precisely the disposition that research-informed teaching seeks to develop. In STEAM disciplines, including architecture, built environment, computing and engineering, emerging technologies create spaces where research and practice intersect meaningfully. Digital Twins and real-time monitoring tools, for example, allow students to work with live systems while engaging critically with the assumptions and ethics embedded within them. Students aren’t merely applying research after the fact, nor mimicking professional routines. They’re learning to question how data is generated, how models simplify reality, and how decisions are shaped by both evidence and judgement. Practice becomes a site of inquiry.

There’s an institutional dimension here too. Across the sector, promotion frameworks, workload models, and teaching quality metrics often reward research visibility and industry engagement without asking how either is translated pedagogically. Academics are encouraged to ‘bring research into teaching’ and ‘embed employability’, yet rarely supported in doing the difficult design work that meaningful integration requires. Recent discussions within the sector have highlighted how delivery models shape the possibilities for integrating academic and workplace learning. These are sector-wide conversations, and they reflect shared challenges around diverse learner cohorts, blended delivery, and the risk of compliance overtaking genuine learning. As a result, many innovative practices remain dependent on individual effort rather than structural support.

None of this means practice-led and research-informed approaches are mutually exclusive. The most effective curricula often blend elements of both. But blending deliberately is quite different from conflating accidentally.

When designing industry-engaged teaching, it’s worth asking honest questions. Are students positioned as inquirers or executors? Are they engaging with contested knowledge or settled practice? Does assessment reward critical reflection or merely competent performance? Is the industry project a vehicle for scholarly inquiry, or is scholarly framing a veneer over vocational training?

The answers won’t always be clear-cut, and that’s fine. But asking the questions helps us design with intention rather than stumbling into confusion – and helps us articulate what we’re doing when a peer reviewer, a sceptical colleague, or a university committee asks us to justify our approach.

Dr Saeed Talebi is an Associate Professor in the Department of Architecture and Built Environment at Birmingham City University and a Senior Fellow of the Higher Education Academy (SFHEA). He has held a number of T&L leadership roles, including Departmental Lead, Course Leader, and Academic Lead for Teaching Excellence and Student Experience. He has a keen interest in pedagogy in higher education, with particular interest in research-informed teaching and the integration of emerging technologies and practice-led projects into built environment curricula to enhance student outcomes and experience. He has also led the delivery of large STEAM research projects.

Professor Nick Morton is the Academic Director of Partnerships and STEAM at Birmingham City University. A Principal Fellow of the Higher Education Academy (PFHEA), he was awarded a National Teaching Fellowship in recognition of his track record in curriculum development. He has held a number of senior leadership roles at BCU, including Associate Dean for Teaching Education and Student Experience, overseeing Computing, Engineering and the Built Environment. He was elected Vice-Chair of the Council of Heads of the Built Environment (CHOBE) in 2012 and is a Fellow of the Royal Institution of Chartered Surveyors (FRICS).


Can folk pedagogies help us understand the limited impact of research on higher education?

by Alex Buckley

The SRHE conference is a great place to see our field in all its glory. From the sessions I attended in December 2025, one thing that was abundantly clear was the desire of so many HE researchers to change the world. A distinctive feature of contemporary HE research – reflecting the social sciences more broadly – is the focus on political and ethical issues, with avowedly political and ethical intentions. The improvement of society is often the explicit end, rather than the more humble improvement of our own part of the education system.

Despite this desire to make a difference, higher education research has for many years been held up as an area where the impact of those working in the field is not what it could be. As George Keller said in 1985, “hardly anyone in higher education pays attention to the research and scholarship about higher education”.

Asking the right questions?

There hasn’t been a lot of work on the gap between research and practice in HE – though there is a fair amount in the schools sector from which we can extrapolate, to a greater or lesser extent. One issue that has received some attention is the fundamental one: are researchers actually asking the right questions?

Viviane Robinson is a researcher who has laid a substantial amount of blame at the feet of researchers, who “have little to offer by way of alternative solutions, when the problems they have been studying are not those of the practitioner” (Robinson 1993). I have recently used Robinson’s model of Problem-Based Methodology to explore whether research about exams in higher education does engage sufficiently with the challenges that teachers take themselves to face. The results were not encouraging.

One of the more straightforward of Robinson’s criteria for impactful research is that researchers should be addressing teachers’ beliefs, and correcting them where they are erroneous. That’s important, but what if those beliefs are hard to shift? We all have stubborn hunches about how higher education works: good ways of motivating students, how to write feedback that will make students pay attention, how to clearly communicate complex ideas. What if there are teacher beliefs that are deeply embedded, so deeply that we don’t always know we have them, but that aren’t helping us and need to change?

One idea that has been explored in the school sector, but has largely passed us by, is the concept of ‘folk pedagogies’. This idea was developed in the 1990s as an extension of the more famous concept of ‘folk psychologies’: the tacit theories that we all have that allow us to make sense of people’s behaviour. For Jerome Bruner, a natural next step from folk psychologies was the idea that we have intuitive theories about how people learn.

“Watch any mother, any teacher, even any babysitter with a child and you’ll be struck by how much of what they do is steered by notions of ‘what children’s minds are like and how to help them learn,’ even though they may not be able to verbalise their pedagogical principles.” (Bruner, 1996)

There has been some research in the school sector about the implications of this idea, particularly in terms of how much difference research makes to educational practice. Folk pedagogies have two features that make them a factor in the impact of education research: they interfere with the uptake of new research-based ideas and approaches, and they are stubborn. On the first point, new ideas about higher education will have to displace the old if they are to influence teachers; and on the second, evidence suggests that even where trainee teachers have ostensibly internalised more scientific theories of learning, the folk pedagogies come creeping back.

In the case of higher education, what might these commonsense, intuitive theories look like? They might just be very general ideas about how people learn, applied to the particular context of higher education. Bruner identifies a range of broad folk pedagogical views, such as one which sees ‘children as knowers’, with a focus on the gathering and organising of facts. Perhaps one kind of folk pedagogy of higher education would be the application of that idea specifically to students in universities rather than other sectors: a focus on the selection, organisation and retention of propositional knowledge within degree programmes. Perhaps there are also specific intuitive theories about higher education that influence teachers’ practices. Perhaps there is a folk intuition that university students should not be spoon-fed – that they must take responsibility for their own learning and seek to develop their own views. Perhaps there is a folk intuition that students should encounter challenging views that encourage them to question their own certainties. In the absence of research, we can only speculate (and introspect).

Respecting the ‘folk’

The idea that teachers have deep intuitions about how students learn, that those intuitions can prevent them from acting on more evidence-based beliefs, and that those intuitions are hard to shake – none of these ideas is particularly earth-shattering. They are probably common sense among those researching and enhancing higher education. The value of the idea of ‘folk pedagogies’ lies instead in the way that it encourages us to take those intuitions seriously, both as an object of study and a powerful barrier to change.

Rather than dismissing intuitions about higher education – as ignorant beliefs and hidebound traditions – we can study them. What are they? Where do they come from? How do they change? The idea of folk pedagogies is not pejorative. There’s no shame in having intuitions about how learning works. As with folk psychological theories, they are necessary parts of how we navigate the world, and something we can’t do without. There is also deep wisdom to be found in those intuitions, even if they are sometimes misleading. Research goes wrong by departing from common sense, at least as much as the other way around.

Acknowledging the existence of folk theories of higher education can help improve the impact of our research in all sorts of ways. We can research them, to understand why teachers and students (and others) do what they do, and the conditions in which deep intuitions can change. It can help us understand where – and why – research has departed so far from common sense as to be of little practical relevance.

It can also help us understand the scale of the challenge. In much of what we do, we’re seeking to modify what university teachers do, which very often means changing how they think. The reality is that we aren’t usually changing superficial, specific beliefs, at least not where the improvements we’re seeking are substantive. We’re changing deep beliefs picked up over a lifetime. Our model of improvement may then need to fit the old adage: if you’re not making progress at a snail’s pace, you’re not making progress. That’s a bit different from annual quality enhancement cycles or short-term strategic initiatives. We can change the world, but it will take time.

References

Bruner, J. (1996). The Culture of Education. Harvard University Press

Robinson, V. M. J. (1993). Problem-Based Methodology: Research for the Improvement of Practice. Pergamon Press

Dr Alex Buckley is an Associate Professor in the Learning & Teaching Academy at Heriot-Watt University, Scotland. His research is focused on conceptual aspects of research and practice in assessment and feedback.


Becoming a professional services researcher in HE – making the train tracks converge

by Charlotte Verney

This blog builds on my presentation at the BERA ECR Conference 2024: at crossroads of becoming. It represents my personal reflections of working in UK higher education (HE) professional services roles and simultaneously gaining research experience through a Masters and Professional Doctorate in Education (EdD).

Professional service roles within UK HE include recognised professionals from other industries (eg human resources, finance, IT) and HE-specific roles such as academic quality, research support and student administration. Unlike academic staff, professional services staff are not typically required, or expected, to undertake research, yet many do. My own experience spans roles within six universities over 18 years delivering administration and policy that supports learning, teaching and students.

Traversing two tracks

In 2016, at an SRHE Newer Researchers event, I was asked to identify a metaphor to reflect my experience as a practitioner researcher. I chose the image of two train tracks, as I have often felt that I have been on two development tracks simultaneously – one building professional experience and expertise, the other developing research skills and experience. These tracks ran in parallel, but never at the same pace, occasionally meeting on a shared project or assignment, and then continuing on their separate routes. I use this metaphor to share my experiences, and three phases, of becoming a professional services researcher.

Becoming research-informed: accelerating and expanding my professional track

The first phase was filled with opportunities; on my professional track I gained a breadth of experience, a toolkit of management and leadership skills, a portfolio of successful projects and built a strong network through professional associations (eg AHEP). After three years, I started my research track with a masters in international higher education. Studying felt separate to my day job in academic quality and policy, but the assignments gave me opportunities to bring the tracks together, using research and theory to inform my practice – for example, exploring theoretical literature underpinning approaches to assessment whilst my institution was revising its own approach to assessing resits. I felt like a research-informed professional, and this positively impacted my professional work, accelerating and expanding my experience.

Becoming a doctoral researcher: long distance, slow speed

The second phase was more challenging. My doctoral journey was long, taking 9 years with two breaks. Like many part-time doctoral students, I struggled with balance and support, with unexpected personal and professional pressures, and I found it unsettling to simultaneously be an expert in my professional context yet a novice in research. I feared failure, and damaging my professional credibility as I found my voice in a research space.

What kept me going, balancing the two tracks, was building my own research support network and my researcher identity. Some of the ways I did this were through Zoom calls with EdD peers for moral support, joining the Society for Research into Higher Education to find my place in the research field, and joining the editorial team of a practitioner journal to build my confidence in academic writing.

Becoming a professional services researcher: making the tracks converge

Having completed my doctorate in 2022, I’m now actively trying to bring my professional and research tracks together. Without a roadmap, I’ve started in my comfort zone, sharing my doctoral research in ‘safe’ policy and practitioner spaces, where I thought my findings could have the biggest impact. I collaborated with EdD peers to tackle the daunting task of publishing my first article. I’ve drawn on my existing professional networks (ARC, JISC, QAA) to establish new research initiatives related to my current practice in managing assessment. I’ve made connections with fellow professional services researchers along my journey, and have established an online network to bring us together.

Key takeaways for professional services researchers

Bringing my professional experience and research tracks together has not been without challenges, but I am really positive about my journey so far, and for the potential impact professional services researchers could have on policy and practice in higher education. If you are on your own journey of becoming a professional services researcher, my advice is:

  • Make time for activities that build your research identity
  • Find collaborators and a community
  • Use your professional experience and networks
  • It’s challenging, but rewarding, so keep going!

Charlotte Verney is Head of Assessment at the University of Bristol. Charlotte is an early career researcher in higher education research and a leader within higher education professional services. Her primary research interests are in the changing nature of administrative work within universities, using research approaches to solve professional problems in higher education management, and using creative and collaborative approaches to research. Charlotte advocates for making the academic research space more inclusive for early career and professional services researchers. She is co-convenor of the SRHE Newer Researchers Network and has established an online network for higher education professional services staff engaged with research.