Making space for compassion

by Marcia Devlin

As is the case in many countries, the COVID pandemic continues to wreak havoc in Australian universities. While many universities are now beginning to experiment with ‘hybrid’ models that combine online and face-to-face teaching and learning, efforts are tentative. Executives and staff are nervous about committing to what many students are increasingly telling us they want – a ‘normal’ university student experience with on-campus components.

This nervousness is well-founded. Our vaccine rollout is not rolling, the federal and state governments are blaming each other, and we have had to have another snap lockdown just this week. Ensuring ‘COVID-safe’ campuses in these circumstances is tricky and, not to put too fine a point on it, connected to potential life and death scenarios.

International borders remain closed. International students – so important to Australian universities and their finances – are not allowed into the country. The current Education Minister gave a speech about the future of international students this week. It was invitation-only but, from what can be gleaned from social media commentary by those fortunate enough to secure an invitation, it didn’t leave audience members brimming with confidence about the immediate future.

In this set of circumstances, it is challenging to focus on the core university ‘businesses’ of teaching and research. This challenge is exacerbated by the fact that those providing the education and research are human beings who are themselves living in and through the pandemic. The work of academics and professionals in universities is complex, messy, deeply human and relies on individual passion and goodwill as well as qualifications, knowledge, skills and experience.

I recently attended a seminar at one of my alma maters, Macquarie University, led by the well-known Australian author Hugh Mackay, in which the importance of compassion was central. Arguing that the most significant thing about us as people is that we share a common humanity, that we humans all belong to a social species, that we are “hopeless” in isolation and that we need others to nurture and sustain us, Mackay underscored the importance of compassion, kindness and simply being nice to one another in our current shared pandemic context.

I’m not sure about other SRHE readers, but compassion and kindness aren’t topics I’ve often heard discussed in universities in my 30 years in the sector. Mackay suggested the pandemic has been a mass experiment in what happens to people when they are isolated. The results have included more anxiety, more suicidal ideation and more domestic violence, among many other negative outcomes, but also more time for introspection and for deep consideration of what is important to us. Many of us have more clearly understood how crucial our social and personal connections are.

Mackay proposes that many of us have previously found useful hiding places in ambition, IT devices and consumerism, which have promoted individualism and competitiveness and a greater focus on ourselves than on our role in families, communities and society. As I reflected on university life, and life more generally, I couldn’t help but think he had a point.

As we co-create the ‘COVID-normal’ university, I wonder if we might all find a bigger space for our humanity, our compassion and our kindness to each other. Not only might that bring a better experience of work in universities for ourselves and those around us, the quality and impact of our education and research might also improve as a result.

Former Senior Deputy Vice-Chancellor Marcia Devlin is a Fellow of SRHE and an Adjunct Professor at Victoria University, Melbourne, Australia.


Dupery by Design

by Petar Jandrić

Since the election of a number of right-wing populist governments across the world, there have been increasing concerns that fake news in online platforms is undermining the legitimacy of the press, the democratic process, and the authority of sources such as science, the social sciences and qualified experts. The global reach of Google, YouTube, Twitter, Facebook, and other platforms has shown that they can be used to spread fake and misleading news quickly and seemingly without control. In addition to their power and reach, these platforms operate, and indeed thrive, in what seems to be an increasingly balkanised media eco-system where networks of users will predominantly access and consume information that conforms to their existing worldviews. Conflicting positions, even if relevant and authoritative, can be suppressed, discredited or overlooked as a result of filter bubbles and echo chambers.

Digital technologies have contributed to the prolific spread of false information, encouraged ignorance in online news consumers, and fostered confusion about how to determine fact from fiction. These same technologies have, however, permitted marginalised voices to be heard (transgender and autistic communities, victims of street harassment, for example), encouraged diversity, facilitated error detection and investigative accountability, and challenged privilege and prejudice. This opens up myriad questions such as:

  • How are online platforms designed to exploit particular vices such as close-mindedness, epistemic nihilism, insouciance, etc. and contribute to the power and dissemination of deception?
  • Deception: what is it? Is there anything peculiar about the times in which we live that should raise special concerns about the proliferation of fake news, lies, bullshit and other such vices online?
  • How do our individual and collective epistemologies interact with digital technologies to produce deceit?
  • How can we counter epistemic vices online, and protect ourselves and our institutions from their potentially baneful effects?
  • Can deception ever be justified? Is there anything to be learned from mass propaganda and deceit in other historical periods?

The epistemology of deceit in a postdigital era

To address these and related questions, Alison MacKenzie, Jennifer Rose and Ibrar Bhatt have edited a book, The Epistemology of Deceit in a Postdigital Era: Dupery by Design. The book offers strong theoretical and philosophical insight into how digital platforms and their constituent algorithms interact with belief systems to achieve deception, and how related vices such as lies, bullshit, misinformation, disinformation and ignorance contribute to deception. This interdisciplinary collection explores how we can better understand and respond to these problematic practices.

Continuing the editors’ earlier work in the Special Issue of Postdigital Science and Education, ‘Lies, Bullshit and Fake News Online: Should We Be Worried?’, the contributors to the collection discuss the diverse ways in which deception is a pervasive feature of our communicative lives. Among the issues explored is how the design and infrastructure of digital platforms enable, or prevent us from distinguishing between, what is true and truthful; fake or real; and informative, misinformative, disinformative or malinformative, among other such information disorders. The scale of the dupery impacts on human rights, individual freedoms and dignity, and agency and autonomy, in addition to the harms mentioned above.

The role of higher education is critical within this context, as universities have traditionally been regarded as sites of epistemic authority where knowledge is created and disseminated through the work of academics and theoretically grounded systems of teaching. Recent trends have shown that universities market the idea that an education through them will create ‘future-ready’, ‘globally-aware’ and ‘critically-thinking’ graduates, equipped with the relevant skills and knowledge to deal with issues facing our modern world, including public health crises, climate change and conflict.

The book was launched at a successful SRHE event held on 16 March 2021, at which the editors, authors and more than 100 members of the public engaged in a lively discussion.

What is next?

These days there is really interesting research taking place in different fields on post-truth and online deceit. Closer to higher education, an interesting example is Michael A Peters, Sharon Rider, Mats Hyvönen and Tina Besley’s popular book Post-Truth, Fake News: Viral Modernity & Higher Education, which discusses the meaning and purpose of higher education in a ‘post-truth’ world.

Aided by a unifying postdigital theoretical framework which holds that human beings are systematically embedded in digital infrastructures, Alison MacKenzie, Jennifer Rose and Ibrar Bhatt make a unique contribution in The Epistemology of Deceit in a Postdigital Era: Dupery by Design, reaching across disciplinary boundaries to explore, examine and counter online deception, and analysing the power of social platforms and their role in the proliferation of epistemic harms. This line of inquiry is in its early days, and it will be very interesting to see where it develops in the future.

Petar Jandrić is a Professor at the University of Applied Sciences in Zagreb (Croatia), Visiting Professor at the University of Wolverhampton (UK), and Visiting Associate Professor at the University of Zagreb (Croatia). His research interests focus on the intersections between critical pedagogy and information and communication technologies. He co-authored the chapter ‘Scallywag Pedagogy’ with Peter McLaren (Chapman University, California) in Post-Truth, Fake News: Viral Modernity & Higher Education. pjandric@tvz.hr


Widening participation, student engagement, alienation, trauma and trust

Caroline S Jones and Zoë Nangah

Social mobility target setting and progression data collection have long been on the agenda for UK HE policy makers and are widely documented, debated and researched (Connell-Smith and Hubble, 2018; Donnelly and Evans, 2018; Social Mobility Commission, 2017, 2019; Phoenix, 2021). Widening Participation (WP) policy underpins much Government target setting, dressed up as a key factor in addressing the nation’s social mobility problems. Much of the work undertaken in this field focuses upon the recruitment of students from the WP demographic onto Higher Education (HE) programmes, with data tracking at key points of the student’s journey as a measuring tool (Vignoles and Murray, 2016; Robinson and Salvestrini, 2020; Phoenix, 2021). However, there appears to be a distinct lack of focus on the student as an individual human being, who arrives in the HE world with prior lived experience, and a lack of consideration of the impact of future life experiences aligned to the student’s individual psychological status.

This omission can have a profound effect on a student’s ability to engage in their programme of study, thus affecting their ability to progress and succeed and contributing to barriers to engagement (Jones and Nangah, 2020). On-entry assessment currently does not capture the presence of traumatic histories, and students may not feel able to fully disclose their experiences until they have established a tutorial connection. Furthermore, HE systems may not have access to information, either on entry or during studies, that would enable appropriate tutorial support and adequate referral, due to GDPR (2018) restrictions and confidentiality principles. Therefore, academic tutors may need the expertise and understanding to support students from a psychological perspective, using specific relational elements in a humanistic manner. At system level, internal and external support for students focusing on their holistic needs might also improve access and progression.

These ideas led us to conduct a deeper investigation into the psychological needs of students, to seek out methods, practices and potential policy changes which might reduce barriers to student engagement. This new knowledge could enable policy makers, HEIs, HE staff and departments to improve their current practice and strengthen progress in terms of the national social mobility agenda (Augar, 2019). Examining barriers to student engagement for the WP demographic, and specifically focusing on the links between psychological alienation theory (Mann, 2001), trauma and trust (Jones, 2017) in the HE context, led us to this new angle on the conundrum of meeting social mobility targets. Furthermore, recent neurological research, such as work on brain and amygdala responses to threat within specific groups (Fanti et al, 2020), could be explored further within HE student populations. Students who are affected by trauma could be better supported by using research-informed practices, focused on individual requirements, that can then be embedded in HE.

To make a difference to current social mobility rates and targets we need to explore new concepts to inform and drive change in the sector. Our systematic literature review (Jones and Nangah, 2020) focused on the analysis of links between alienation theory (Mann, 2001; Jones, 2017), experiences of prior, existing or present trauma, and the student’s ability to trust in the academic systems within which they are placed. The presence of traumatic emotional experiences in WP student populations, connected to psychosocial and academic trust alienation theory, contributes to understanding engagement barriers in HE. Using PRISMA guidelines, 43 publications were screened based on inclusion/exclusion criteria. Our review identified students’ experiences of trauma and how these had affected their HE educational engagement. It documented support strategies for student success and improvements in HEIs’ commitment to meeting WP agendas. This underlined the need for HEIs to commit to the social mobility agenda in a way which is aligned with barriers to student engagement. Current tracking and support systems may need to be augmented by training for academic staff in relational tutorial systems, emphasising the presence of a consistent tutor. Jenkins (2020) suggests a single-session approach for addressing student needs within a short-term counselling model, but recognises this may not be suitable for students with more complex requirements. Thus, longer-term interventions and individualised counselling support approaches are arguably needed to support this demographic.

To decrease barriers to student engagement we need to focus on psychological well-being and on collaborative HEI strategies to improve recruitment, retention and ultimate success. Our systematic review argued that a deeper understanding of the complexities of student needs should be embedded within HE teacher training programmes and curriculum delivery. Extending teaching skills to embed psychological understanding and practice delivery skills would not only work to meet Government targets but also raise aspirations: ‘…with the right approach, the transmission of disadvantage from one generation to the next can be broken’ (Social Mobility Commission, 2017: 8). Fulfilling the moral and corporate responsibility of HEIs to support the success of WP students might need new insights. A focus on student engagement in HE, informed by a better understanding of psychological alienation theory, trauma and trust, could be used by multiple HE audiences and across countries to improve practice and drive both political and educational change for the most disadvantaged individuals. It is time to view HE students from WP backgrounds as individuals, to respect their aspirational aims and to value their experiences in a way that best suits their subjective requirements, so that they may progress and succeed, helping to improve social mobility.

SRHE member Caroline S Jones is an applied social sciences professional with extensive experience in the children and young people field and HE programme leadership. She is a Tutor in the Education Faculty at Manchester Metropolitan University and was previously a Lecturer at the University Campus Oldham and at Stockport University Centre. Twitter: @caroline_JonesSFHEA. LinkedIn: https://www.linkedin.com/in/caroline-jones-1bab40b3/

SRHE member Zoe Nangah has been a Lecturer/Senior Lecturer in HE for 16 years across Psychology, Social Sciences, Counselling and Childhood Studies disciplines. She is currently a Senior Lecturer and Course Leader at the University of Chester for the MA Clinical Counselling course. Zoe is a qualified counsellor and supervisor and has conducted research into emotional experiences within student populations and explored perceptions of the support services. Twitter @zoenangah 

References

Fanti, KA, Konikou, K, Cohn, M, Popma, A and Brazil, IA (2020) ‘Amygdala functioning during threat acquisition and extinction differentiates antisocial subtypes’ Journal of Neuropsychology 14(2): 226-241

Jenkins, P (2020) ‘Single session formulation: an alternative to the waiting list’ University and College Counselling 8(4)

Mann, SJ (2001) ‘Alternative Perspectives on the Student Experience: Alienation and Engagement’ Studies in Higher Education 26 (1): 7–19

Robinson, D and Salvestrini, V (2020) The Impact of Interventions for Widening Access to Higher Education London: Education Policy Institute: TASO

Social Mobility Commission (2017) State of the Nation 2017: Social Mobility in Great Britain London: Social Mobility Commission

Social Mobility Commission (2019) State of the Nation 2018-2019: Social Mobility in Great Britain London: Social Mobility Commission


In search of the perfect blend – debunking some myths

by Ling Xiao and Lucy Gill-Simmen

A blended learning approach has been adopted by most UK universities in response to the Covid pandemic. Institutions and higher education educators have become fully committed to such an approach and have invested enormous resources in delivering it. This has helped to maintain a high quality education and, to some extent, to mitigate the potential dire impact of the pandemic on students’ learning experience. The approach has no doubt accelerated a reshaping of pedagogic approaches, facilitating deep learning, autonomy, personalised learning and more. With the rapid pace of the UK’s vaccine rollout, and the semi-promise by the UK Government that we’ll be back to some kind of normal by the end of June, there is hope for a possible return to campus in September 2021. As a result, this now marks a time to reflect on what we have learned from the blended learning approach and figure out what to take forward in designing teaching and learning post-pandemic, be it hybrid or hyflex or both.

The Management Education Research Group (MERG) in the School of Business and Management at Royal Holloway, University of London recently held a symposium on ‘Reflecting on Blended Learning, What’s Next?’. It showcased blended learning examples from various universities across the UK and was followed by a panel discussion where we posed the question: what worked and what didn’t work? We found that some of our previous assumptions were brought into question and a number of myths were debunked.

Myth 1: Pre-recorded videos should be formal and flawless

Overnight, and without any training, educators took on the role of film creators, film editors and videographers. Spending hours, days, weeks and even months developing lecture recordings from the confines of our home working spaces, we were stopping, starting, re-starting, editing out the slightest imperfection. It has to be perfect, right? Not so fast.

For many institutions, recorded video is the primary delivery vehicle for blended learning content, with academics pre-recording short presentations, lectures and informal videos to complement text-based information and communication. Many of us postulated that a formal and meticulous delivery style for pre-recorded videos is required to maintain high quality educational materials and for students to perceive us as professionals. Academics’ personal experiences, however, suggest it is vital to keep the human element, as students enjoy and engage better with a personalised delivery. A personalised style helps to build relationships with students, which then provide the foundations of learning. Mayer (2009) describes this as the personalisation principle of learning through video and recommends a conversational style over a formal style for learning. This also resonates with recent insights from Harvard Business School Professor Francesca Gino, who reflects in her webinar on the power of teaching with vulnerability during COVID-19. She explains the importance of being open, honest and transparent with students and sharing one’s own human side in order to strengthen the educator-learner bond.

Myth 2: Students enjoy learning from the comfort of their homes

Blended learning empowers students to become autonomous learners, since they can engage with their courses when real-time contact with lecturers is not possible. However, such autonomy isn’t all it’s cracked up to be, and turns out to be a lonely road for many students. Instead of relishing staying at home and learning when they want, some students declare they miss the structure, the sense of community and the feeling of belonging they associate with attending university in person.

Universities are more than places for learning; they serve as the centre of their communities for students. Students not only learn directly from the education but also, just as much, from interaction and collaboration with lecturers and their fellow classmates. It emerged in conversation between students and faculty that students felt it generally took longer to establish a sense of community in an online class than in a traditional face-to-face classroom, but that it could be achieved. So, it’s up to us as educators to foster a sense of community amongst online learners.

Central to a learning community is the concept of cooperative learning, which has been shown to promote productivity, expose students to interdisciplinary teams, foster idea generation and promote social interaction (Machemer, 2007). One technique is to introduce collaborative learning opportunities that reach beyond online group work and assessment – which in itself may prove problematic for learners. Instead, educators should look to develop co-creation projects such as wikis or blogs where students can come together to co-create content. Social annotation platforms such as Google Docs and Padlet enable students to share knowledge and develop their understanding of learning objects by collaborating on notes and commenting on specific parts of materials (Novak et al, 2012; Miller et al, 2018). Padlet, for example, has proved particularly popular with students for collaborative learning, given its ease of use.

Myth 3: It makes sense to measure student engagement merely by participation metrics

After months of instructional design and preparing the perfect learning journey for students, we tend to expect students to learn and to engage in a way that we as educators perceive to be optimal for fulfilment of learning outcomes.

We all know that students learn and engage in many different ways, but we often find ourselves trawling the data and metrics to see whether students watched the videos, engaged with the readings we provided, posted on the fora we clearly labelled and participated in the mini quizzes and reflection exercises we created. As our hearts sink at what appears at times to be a relatively low uptake, we jump to the conclusion that students aren’t engaging. Here’s the thing: they are – we just don’t see it. Engagement as a construct is far more complex and multi-faceted, and we can’t necessarily measure it using the report logs on the VLE.

Student engagement is often labelled the “holy grail of learning” (Sinatra, Heddy and Lombardi, 2015: 1) since it correlates strongly with educational outcomes, including academic achievement and satisfaction. This can lead to frustration on the part of educators when engagement appears low. However, engagement comes in many forms, often forms which are not directly visible or measurable. For example, cognitive, behavioural and emotional engagement all have very different indicators which are not immediately apparent. Hence new ways of evaluating student engagement in the blended learning environment are needed. Halverson and Graham (2019) propose a conceptual framework for engagement that includes cognitive and emotional indicators, offering examples of research measuring these indicators in technology-mediated learning contexts.

Myth 4: Technology is the make or break for blended learning

The more learning technologies we can add to our learning design, the better, right? Wrong. Some students declared that the VLE had too much going on; they couldn’t keep up with all the apps and technologies they were required to work with to achieve their learning.

Although technology clearly plays a key role in the provision of education (Gibson, 2001; Watson, 2001; Gordon, 2014), it is widely acknowledged that technology should not determine but instead complement theories and practices of teaching. The onset of Covid-19 has shifted our focus to technology rather than pedagogy. For example, educators felt an immediate need for breakout room functionality: although this can be a significant function for discussion, it is not necessarily so for disciplines such as accounting, which requires students continuously to apply techniques in order to excel at applied tasks. Pedagogy should determine technology. The chosen technology must serve a purpose and facilitate the aim of the pedagogy, and should not be used as bells and whistles to make the learning environment appear more engaging. In our recent research, we provide empirical evidence for the effective pedagogical use of Padlet to support learning and engagement (Gill-Simmen, 2021). Technology has an impact on pedagogy but should not be the driver in a blended or hybrid learning environment. Learning technologies are only applicable and of value when the right content is presented in the right format at the right time.

In summary, we learned these lessons for our future approach to hybrid learning:

  1. Aim for ‘human’ not perfection in instructional design
  2. Students don’t want to learn alone – create opportunities for collaborative learning
  3. Student engagement may not always be measurable – consider tools for assessing emotional, cognitive and behavioural engagement
  4. Technology should support pedagogy, not vice versa – implement only those technologies which facilitate student learning

SRHE member Dr Ling Xiao is the Director of MERG and a Senior Lecturer in Financial Management at Royal Holloway, University of London.  Follow Ling via @DrLingXiao on Twitter.

SRHE member Dr Lucy Gill-Simmen is a Senior Lecturer in Marketing at Royal Holloway, University of London and Program Director for Kaplan, Singapore. Follow Lucy via @lgsimmen on Twitter.

References

Gill-Simmen, L. (2021). Using ‘Padlet’ in Instructional Design to Promote Cognitive Engagement: A Case Study of UG Marketing Students, Journal of Learning Development in Higher Education (in press).

Machemer, P.L. (2007). Student perceptions of active learning in a large cross-disciplinary classroom, Active Learning in Higher Education, 8(1): 9-29.

Mayer, R. E. (2009). Multimedia Learning (2nd edn). Cambridge, England: Cambridge University Press.


An SRHE playlist

by Leo Goedegebuure

There are many ways of communicating. Text always has been our main medium, but the last year has clearly shown that there are other ways. One of the most popular articles in the recent special issue of Studies in Higher Education on the impact of the pandemic was Amy Metcalfe’s photo-based essay. We had a massive SRHE webinar on the contributions, with a truly global audience. Taking David Bowie’s Sound and Vision to the extreme, we have done the Vision but we haven’t done the Sound.

2021 will be another special year. By the end of the year we will still not be able to come together face-to-face at the 2021 SRHE conference, although an exciting alternative kind of conference is being planned. It will be good to have a decent soundtrack for the event. So we thought we might kick this off with a bit of advance planning – and activity. Last-minute work can be a bit tedious and stressful. So we propose a two-pronged approach. We’ll start by inviting this year’s contributors to the Studies in Higher Education special issue to submit a 5-song playlist in addition to their accepted and online-published article. And we invite all the readers of this blog to do the same. What we expect as the outcome of this fun and silly project is a reflection of the diversity of our community in music.

So let me kick this off. The basic model is: song and a brief one-sentence reason why, plus Spotify link.  Here we go:

1 Amy Macdonald – Let’s Start a Band                                   

The obvious opener for a project like this

2 David Bowie – Life on Mars                                                     

The amazing achievement of the Mars Perseverance Rover so far and a tribute to one who left too early

3 Bruce Springsteen – The Ghost of Tom Joad                     

Too many ghosts of 2020 and a brilliant contribution of Tom Morello

4 REM – Nightswimming                                                              

Quietly avoiding restrictions without creating chaos and such a great song

5 Vreemde Kostgangers – Touwtje uit de Deur                   

My Dutch heritage; the literal translation is “A Rope from the Letterbox” reflecting on a time when you could just pull a little rope to enter your neighbour’s house

Amy Metcalfe has also skipped in already with her suggestions, which have been included in the playlist:

1 Snow Patrol – Life on Earth

“This is something else.”

2 Foster the People – Imagination

“We can’t change the things we can’t control.”

3 Haelos – Hold On

“Hold on.”

4 The Weeknd – Blinding Lights

“I’ve been on my own for long enough.”

5 Lastlings – Out of Touch

“Don’t want this to fall apart; Is this what you really need?”

There will be more to follow from contributors to the SHE special issue, but everyone is invited to send in their own 5-track playlist to rob.cuthbert@uwe.ac.uk and leo.g@unimelb.edu.au. We will provide updates via the blog at regular intervals, and aim to compile a comprehensive playlist later in the year – which may or may not become lockdown listening, depending on where in the world you are and how your country is faring in the pandemic. We hope you enjoy it.

Leo Goedegebuure is Editor-in-Chief, Studies in Higher Education, Professorial Fellow in the Melbourne Centre for the Study of Higher Education and the Melbourne Sustainable Society Institute, and Honorary Professor in the School of Global, Urban and Social Studies, College of Design and Social Context, RMIT University.


The Social Mobility Index (SMI): A welcome and invitation to debate from the Exeter Centre for Social Mobility

by Anna Mountford-Zimdars and Pallavi Banerjee

There is a new English league table on the block! Welcome! The focus of this new ranking is social mobility – the clue is in the name: the Social Mobility Index (SMI). Focusing on social mobility differentiates the SMI from other league tables, which often include dimensions such as prestige, research income, staff qualifications, student satisfaction and employment outcomes.

The SMI is specifically about an institution’s contribution to supporting disadvantaged learners. It uses the OfS model of access to, progression within and outcomes after higher education. Leaning on a methodology developed for an SMI in the US, the English version contains three dimensions: (1) Access, drawing on the Index of Multiple Deprivation (IMD); (2) Continuation, using progression data into the second year, drawing on IMD; and (3) Salaries (adjusted for local purchasing power), using Longitudinal Education Outcomes (LEO) salary data collected one year after graduation.

The SMI report thoughtfully details the rationale for the measures used and is humble in acknowledging that other, more useful measures might be developed. But do the reflections of the authors go far enough? Take the graduate outcome LEO data, for example. These capture salaries 15 months into employment – too early for an outcome measure. They are also not broken down by IMD, there are heaps of missing data in LEO, and those who continue into further study are not captured. Low-IMD students may or may not be earning the same sort of salaries as their more advantaged peers. The regional weightings seem insufficient in light of the dominance of high-salary regions in both the US and English SMIs. These shortcomings make the measure a highly problematic one to use, though the authors are right to endeavour to capture some outcome of higher education.

We would like a bolder SMI. Social mobility is not only about income but about opportunities and choice, and about enabling meaningful contribution to society. This was recognised in Bowen and Bok’s (2000) evaluation of affirmative action, which measured ‘impact’ not only as income but as civic contribution, health and well-being. Armstrong and Hamilton (2015) show the importance of friendship and marriage formation as a result of shared higher education experiences. The global pandemic has shown that the most useful jobs we rely on, such as early years educators, are disgracefully underpaid. The present SMI’s reduction of ‘success’ to a poor measure of economic outcomes needs redressing in light of how far the academic debate has advanced.

Also, social mobility is about more than class, it is about equal opportunities for first generation students, disabled students, men and women, refugees, asylum seekers, global majority ethnic groups as well as local, regional, national and international contributions. It is also about thinking not only about undergraduate student access, progress and success but about postgraduates, staff and the research and teaching at universities.

A really surprising absence in the introduction of this new SMI is any reference to the Times Higher Education Impact Rankings. These are the only global performance tables that assess universities against the United Nations’ Sustainable Development Goals. First published in 2019, this ranking includes a domain on reducing inequality. The metrics used by the Times Higher ranking are: Research on reducing inequalities (27%); First-generation students (23.1%); Students from developing countries (15.4%); Students and staff with disabilities (11.4%); and Institutional measures against discrimination – including outreach and admission of disadvantaged groups (23.1%). The THE ranking recognises that institutions also contribute to social mobility through what they research and teach. This dimension should be borrowed for an English SMI in light of the importance attached to research-led, research-informed and research-evidenced practices in the higher education sector.
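
For illustration only, a weighted composite of this kind is simple to compute. The sketch below assumes straightforward linear aggregation of indicators normalised to a common 0-100 scale – an assumption on our part, not a description of the THE’s published methodology – using the weights quoted above; the institution’s scores are invented.

```python
# Hypothetical illustration: combining the five 'reducing inequality' metrics
# into one composite score by linear weighting. The weights are those quoted
# in the text; the normalisation and aggregation method are assumptions, and
# the example institution's scores are invented.

WEIGHTS = {
    "research_on_reducing_inequalities": 0.270,
    "first_generation_students": 0.231,
    "students_from_developing_countries": 0.154,
    "students_and_staff_with_disabilities": 0.114,
    "measures_against_discrimination": 0.231,
}

def composite_score(indicators):
    """Weighted sum of indicators, each assumed normalised to a 0-100 scale."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items())

example_institution = {
    "research_on_reducing_inequalities": 62.0,
    "first_generation_students": 48.5,
    "students_from_developing_countries": 30.0,
    "students_and_staff_with_disabilities": 55.0,
    "measures_against_discrimination": 70.0,
}
print(round(composite_score(example_institution), 1))  # 55.0
```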

The use in the THE ranking of individual measures – those whose parents have no background of higher education (first-generation students) and those with disabilities, including staff – has merit. Yes, individual-level measures are often challenging to ‘operationalise’. But this shouldn’t prevent us from arguing that they are the right measures to aspire to using. The use of first-generation students also highlights that the UK debate, which focuses on area-level measures of disadvantage such as IMD or POLAR, differs from the international framing, in which first-generation status measures the educational background of students’ families.

The inclusion of staff in the THE ranking is an interesting domain that merits consideration. For example, data on the gender pay gap is easily obtainable in England and would indicate something about organisational culture. Athena Swan awards, the Race Equality Charter and other similar awards, which are indicators of diversity and equality in an institution, could be considered as organisational commitments to the agenda and are straightforward to operationalise.

We warmly welcome the SMI and congratulate Professor David Phoenix on putting the debate centre-stage; his briefing is already stimulating discussion, with Professor Peter Scott making a thoughtful contribution. It is important to think about social mobility and added value as part of the league table world. It is in the nature of league tables that they oversimplify the work done by universities and staff and the achievements of their students.

There is real potential in the idea of an SMI and we hope that our contribution will bring some of these dimensions into the public debate about how we construct the index. This would create an SMI that celebrates good practice by institutions in the space of social mobility and encourages more good practice, ultimately making higher education more inclusive and diverse while supporting success for all.

SRHE member Anna Mountford-Zimdars is Professor of Social Mobility and Academic Director of the Centre for Social Mobility at the University of Exeter. Pallavi Amitava Banerjee is a Fellow of the Higher Education Academy. She is an SRHE member and Senior Lecturer in Education in the Graduate School of Education at the University of Exeter.

Some different lessons to learn from the 2020 exams fiasco

by Rob Cuthbert

The problems with the algorithm used for school examinations in 2020 have been exhaustively analysed, before, during and after the event. The Royal Statistical Society (RSS) called for a review, after its warnings and offers of help in 2020 had been ignored or dismissed. Now the Office for Statistics Regulation (OSR) has produced a detailed review of the problems, Learning lessons from the approach to developing models for awarding grades in the UK in 2020. But the OSR report only tells part of the story; there are larger lessons to learn.

The OSR report properly addresses its limited terms of reference in a diplomatic and restrained way. It is far from an absolution – even in its own terms it is at times politely damning – but in any case it is not a comprehensive review of the lessons which should be learned, it is a review of the lessons for statisticians to learn about how other people use statistics. Statistical models are tools, not substitutes for competent management, administration and governance. The report makes many valid points about how the statistical tools were used, and how their use could have been improved, but the key issue is the meta-perspective in which no-one was addressing the big picture sufficiently. An obsession with consistency of ‘standards’ obscured the need to consider the wider human and political implications of the approach. In particular, it is bewildering that no-one in the hierarchy of control was paying sufficient attention to two key differences. First, national ‘standardisation’ or moderation had been replaced by a system which pitted individual students against their classmates, subject by subject and school by school. Second, 2020 students were condemned to live within the bounds not of the nation’s, but their school’s, historical achievements. The problem was not statistical nor anything to do with the algorithm, the problem was with the way the problem itself had been framed – as many commentators pointed out from an early stage. The OSR report (at 3.4.1.1) said:

“In our view there was strong collaboration between the qualification regulators and ministers at the start of the process. It is less clear to us whether there was sufficient engagement with the policy officials to ensure that they fully understood the limitations, impacts, risks and potential unintended consequences of the use of the models prior to results being published. In addition, we believe that, the qualification regulators could have made greater use of  opportunities for independent challenge to the overall approach to ensure it met the need and this may have helped secure public confidence.”

To put it another way: the initial announcement by the Secretary of State was reasonable and welcome. But when Ofqual proposed that ranking students and tying each school’s results to its past record was the only way to do what the SoS wanted, no-one in authority was willing either to change the approach, or to make its implications transparent enough for the public to lose confidence at the start – in time for government and Ofqual to change course.
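
To make the shape of that framing concrete, here is a deliberately simplified sketch – not Ofqual’s actual model, and with entirely hypothetical names and numbers – of what it means to rank a school’s students in a subject and then map those ranks onto the school’s historical grade distribution. The point it illustrates is that, under such a scheme, a student’s grade is bounded by their rank against classmates and by what the school achieved in earlier years, however strong the 2020 cohort.

```python
# Simplified illustration only (not Ofqual's 2020 algorithm): within one school
# and subject, students are ranked and grades are allocated so that the
# proportions match the school's historical grade distribution.

def allocate_grades(ranked_students, historical_distribution):
    """ranked_students: names in rank order, best first.
    historical_distribution: grade -> historical proportion, best grade first."""
    n = len(ranked_students)
    grades, cursor = {}, 0
    for grade, proportion in historical_distribution.items():
        quota = round(proportion * n)          # places this grade gets this year
        for student in ranked_students[cursor:cursor + quota]:
            grades[student] = grade
        cursor += quota
    lowest = list(historical_distribution)[-1]  # rounding remainder -> lowest grade
    for student in ranked_students[cursor:]:
        grades[student] = lowest
    return grades

# Hypothetical school whose history caps top grades at 10%, however able this
# year's cohort happens to be.
history = {"A": 0.10, "B": 0.30, "C": 0.40, "D": 0.20}
students = [f"student_{i}" for i in range(1, 21)]   # 20 students, already ranked
print(allocate_grades(students, history))
```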

The OSR report repeatedly emphasises that the key problem was a lack of public confidence, concluding that:

“… the fact that the differing approaches led to the same overall outcome in the four countries implies to us that there were inherent challenges in the task; and these challenges meant that it would have been very difficult to deliver exam grades in a way that commanded complete public confidence in the summer of 2020 …”

“Very difficult”, but, as Select Committee chair Robert Halfon said in November 2020, things could have been much better:

“the “fallout and unfairness” from the cancellation of exams will “have an ongoing impact on the lives of thousands of families”. … But such harm could have been avoided had Ofqual not buried its head in the sand and ignored repeated warnings, including from our Committee, about the flaws in the system for awarding grades.”

As the 2021 assessment cycle comes closer, attention has shifted to this year’s approach to grading, when once again exams will not feature except as a partial and optional extra. When the interim Head of Ofqual, Dame Glenys Stacey, appeared before the Education Select Committee, Schools Week drew some lessons which remain pertinent, but there is more to say. An analysis of 2021 by George Constantinides, a professor of digital computation at Imperial College whose 2020 observations were forensically accurate, has been widely circulated and equally widely endorsed. He concluded in his 26 February 2021 blog that:

“the initial proposals were complex and ill-defined … The announcements this week from the Secretary of State and Ofqual have not helped allay my fears. … Overall, I am concerned that the proposed process is complex and ill-defined. There is scope to produce considerable workload for the education sector while still delivering a lack of comparability between centres/schools.”

The DfE statement on 25 February kicks most of the trickiest problems down the road, and into the hands of examination boards, schools and teachers:

“Exam boards will publish requirements for schools’ and colleges’ quality assurance processes. … The head teacher or principal will submit a declaration to the exam board confirming they have met the requirements for quality assurance. … exam boards will decide whether the grades determined by the centre following quality assurance are a reasonable exercise of academic judgement of the students’ demonstrated performance. …”

Remember in this context that Ofqual acknowledges “it is possible for two examiners to give different but appropriate marks to the same answer”. Independent analyst Dennis Sherwood and others have argued for alternative approaches which would be more reliable, but there is no sign of change.

Two scenarios suggest themselves. In one, where this year’s results are indeed pegged to the history of previous years, school by school, we face the prospect of overwhelming numbers of student appeals, almost all of which will fail, leading no doubt to another failure of public confidence in the system. The OSR report (3.4.2.3) notes that:

“Ofqual told us that allowing appeals on the basis of the standardisation model would have been inconsistent with government policy which directed them to ‘develop such an appeal process, focused on whether the process used the right data and was correctly applied’.”

Government policy for 2021 seems not to be significantly different:

“Exam boards will not re-mark the student’s evidence or give an alternative grade. Grades would only be changed by the board if they are not satisfied with the outcome of an investigation or malpractice is found. … If the exam board finds the grade is not reasonable, they will determine the alternative grade and inform the centre. … Appeals are not likely to lead to adjustments in grades where the original grade is a reasonable exercise of academic judgement supported by the evidence. Grades can go up or down as the result of an appeal.” (emphasis added)

There is one crucial exception: in 2021 every individual student can appeal. Government no doubt hopes that this year the blame will all be heaped on teachers, schools and exam boards.

The second scenario seems more likely and is already widely expected, with grade inflation outstripping the 2020 outcome. There will be a check, says DfE, “if a school or college’s results are out of line with expectations based on past performance”, but it seems doubtful whether that will be enough to hold the line. The 2021 approach was only published long after schools had supplied predicted A-level grades to UCAS for university admission. Until now there has been a stable relationship between predicted grades and examination outcomes, as Mark Corver and others have shown. Predictions exceed actual grades awarded by consistent margins; this year it will be tempting for schools simply to replicate their predictions in the grades they award. Indeed, it might be difficult for schools not to do so, without leaving their assessments subject to appeal. In the circumstances, the comments of interim Ofqual chief Simon Lebus that he does not expect “huge amounts” of grade inflation seem optimistic. But it might be prejudicial to call this ‘grade inflation’, with its pejorative overtones. Perhaps it would be better to regard predicted grades as indicators of what each student could be expected to achieve at something close to their best – which is in effect what UCAS asks for – rather than when participating in a flawed exam process. Universities are taking a pragmatic view of possible intake numbers for 2021 entry, with Cambridge having already introduced a clause seeking to deny some qualified applicants entry in 2021 if demand exceeds the number of places available.

The OSR report says that Ofqual and the DfE:

“… should have placed greater weight on explaining the limitations of the approach. … In our view, the qualification regulators had due regard for the level of quality that would be required. However, the public acceptability of large changes from centre assessed grades was not tested, and there were no quality criteria around the scale of these changes being different in different groups.” (3.3.3.1)

The lesson needs to be applied this year, but there is more to say. It is surprising that there was apparently such widespread lack of knowledge among teachers about the grading method in 2020 when there is a strong professional obligation to pay attention to assessment methods and how they work in practice. Warnings were sounded, but these rarely broke through to dominate teachers’ understanding, despite the best efforts of education journalists such as Laura McInerney, and teachers were deliberately excluded from discussions about the development of the algorithm-based method. The OSR report (3.4.2.2) said:

“… there were clear constraints in the grade awarding scenario around involvement of service delivery staff in quality assurance, or making the decisions based on results from a model. … However, we consider that involvement of staff from centres may have improved public confidence in the outputs.”

There were of course dire warnings in 2020 to parents, teachers and schools about the perils of even discussing the method, which undoubtedly inhibited debate, but even before then exam processes were not well understood:

“… notwithstanding the very extensive work to raise awareness, there is general limited understanding amongst students and parents about the sources of variability in examination grades in a normal year and the processes used to reduce them.” (3.2.2.2)

My HEPI blog just before A-level results day was aimed at students and parents, but it was read by many thousands of teachers, and anecdotal evidence from the many comments I received suggests it was seen by many teachers as a significant reinterpretation of the process they had been working on. One teacher said to Huy Duong, who had become a prominent commentator on the 2020 process: “I didn’t believe the stuff you were sending us, I thought it [the algorithm] was going to work”.

Nevertheless the mechanics of the algorithm were well understood by many school leaders. FFT Education Datalab was analysing likely outcomes as early as June 2020, and reported that many hundreds of schools had engaged them to assess their provisional grade submissions, some returning with a revised set of proposed grades for further analysis. Schools were seduced into, or reduced to, trying to game the system, feeling they could not change the terrifying and ultimately ridiculous prospect of putting all their many large cohorts of students in strict rank order, subject by subject. Ofqual were victims of groupthink; too many people who should have known better simply let the fiasco unfold. Politicians and Ofqual were obsessed with preventing grade inflation, but – as was widely argued, long in advance – public confidence depended on broader concerns about the integrity and fairness of the outcomes.

In 2021 we run the same risk of loss of public confidence. If that transpires, the government is positioned to blame teacher assessments and probably reinforce a return to examinations in their previous form, despite their known shortcomings. The consequences of two anomalous years of grading in 2020 and 2021 are still to unfold, but there is an opportunity, if not an obligation, for teachers and schools to develop an alternative narrative.

At GCSE level, schools and colleges might learn from emergency adjustments to their post-16 decisions that there could be better ways to decide on progression beyond GCSE. For A-level/BTEC/IB decisions, schools should no longer be forced to apologise for ‘overpredicting’ A-level grades, which might even become a fairer and more reliable guide to true potential for all students. Research evidence suggests that “Bright students from poorer backgrounds are more likely than their wealthier peers to be given predicted A-level grades lower than they actually achieve”. Such disadvantage might diminish or disappear if teacher assessments became the dominant public element of grading; at present too many students suffer the sometimes capricious outcomes of final examinations.

Teachers’ A-level predictions are already themselves moderated and signed off by school and college heads, in ways which must to some extent resemble the 2021 grading arrangements. There will be at least a two-year discontinuity in qualification levels, so universities might also learn new ways of dealing with what might become a permanently enhanced set of differently qualified applicants. In the longer term HE entrants might come to have different abilities and needs, because of their different formation at school. Less emphasis on preparation for examinations might even allow more scope for broader learning.

A different narrative could start with an alternative account of this year’s grades – not ‘standards are slipping’ or ‘this is a lost generation’, but ‘grades can now truly reflect the potential of our students, without the vagaries of flawed public examinations’. That might amount to a permanent reset of our expectations, and the expectations of our students. Not all countries rely on final examinations to assess eligibility to progress to the next stage of education or employment. By not wasting the current crisis we might even be able to develop a more socially just alternative which overcomes some of our besetting problems of socioeconomic and racial disadvantage.

Rob Cuthbert is an independent academic consultant, editor of SRHE News and Blog and emeritus professor of higher education management. He is a Fellow of the Academy of Social Sciences and of SRHE. His previous roles include deputy vice-chancellor at the University of the West of England, editor of Higher Education Review, Chair of the Society for Research into Higher Education, and government policy adviser and consultant in the UK/Europe, North America, Africa, and China.


Let them eat data: education, widening participation and the digital divide

by Alex Blower and Nik Marsdin

The quest for an answer

As an education sector we like answers, answers for everything, right or wrong. Sometimes we’re more concerned with arriving at an answer, than we are with ensuring it tackles the issue addressed by the question.

Widening HE participation is led by policy that dictates which answers we provide to what questions and to whom. All too often this leads to practitioners scrambling for answers to questions which are ill fitting to the issue at hand, or looking for a quick solution in such haste that we forget to read the question properly.

The COVID-19 pandemic has once again laid bare the stark inequality faced by children and young people in our education system. With it has been an influx of new questions from policy makers, and answers from across the political and educational spectrum.

A magic ‘thing’

More often than not, answers to these questions will comprise a ‘thing’. Governments like tangible things: mentoring, tutoring, longer days, boot camps and shiny new academies, all of which align with the good old-fashioned ‘fake it till you make it’ meritocratic ideal. For the last 40 years the Government has shied away from recognising, let alone addressing, embedded structural inequality from birth. It’s difficult, it’s complicated, and it can’t readily be answered in a tweet or a soundbite from a 6pm press conference.

The undesirable implications of a search for an ‘oven-ready’ answer can be seen in the digital divide – a stark example of what access to the internet means for the haves and have-nots of the technological age.

‘So, the reason young people are experiencing extreme inequality and not becoming educationally successful, is because they don’t have enough access to technological things?’

‘What we need is a nice solid technological thing we can pin our hopes on…’

‘Laptops for everyone!’

Well, (and I suspect some voices in the back know what’s coming) access to technology alone isn’t the answer, in the same way that a pencil isn’t the answer to teaching a child to write.

Technology is a thing, a conduit, a piece of equipment that, if used right, can facilitate a learning gain. As professionals working to widen HE participation, we need to challenge these ‘oven-ready’ answers, especially if they seem misguided or, dare I say it, woefully ignorant of the challenges working-class communities face.

After distribution of the devices, online engagement didn’t change

Lancaster University developed the ‘Connecting Kids’ project during the first wave of COVID-19, as a direct response to calls for help from local secondary schools. The project achieved what it set out to do, in that it procured over 500 brand new laptops or Chromebooks, and free internet access for all recipients. Every child who fell outside the Department for Education scheme and was without a suitable device in the home would now have one. Problem solved, right?

Not quite. Engagement in online learning environments prior to the DfE scheme and Connecting Kids initiative in years 8 and 9 was hovering at about 30% of students engaging daily, and 45% weekly. After the distribution of devices, engagement remained at almost exactly the same level. Further inspection of the data from the telecoms provider showed that of the 500 mobile connections distributed, only 123 had been activated. Of those 123, only half were being regularly used. Of the 377 ‘unused’ sim and mi-fi packages, around 200 showed ‘user error’ in connection status.

Again, this may come as no surprise to the seasoned professionals working with children and young people at the sharp end of structural inequality, but it turned out the ‘thing’ wasn’t the answer. Who would have thought it?

Understanding communities and providing resources

Fast forward six months, and monthly interviews with participating school staff (part of the project evaluation, not yet complete) show that online engagement in one school is up to 92%. The laptops have played a valuable role in that: they have enabled access. What they haven’t done, however, is understand and make allowances for the circumstances of children, young people and families. That has taken a commitment by the schools to provide holistic wrap-around services in partnership with other organisations. It has included short courses on connecting to the internet, and provision of basic learning equipment such as pencils, paper and pens. It has included the school day and timetable being replicated online, live feedback sessions with teachers and learning assistants, and drop-in sessions for parents and carers. Most importantly, it has included a recognition of the difference between home and school, and the impact it has on the education of working-class young people.

Back to policy and widening participation. If we are to make our work truly meaningful for young people, we must critically engage with a policy narrative built around a desire for quick fixes, soundbites and ‘oven-ready’ things. We owe it to the young people who are being hit hardest by this pandemic to take a step back and look at the wider barriers they face.

To do this we may need to reconceptualise what it means to support them into higher education. This starts with challenging much of the policy designed to improve access to higher education that is built upon a premise of individual deficit. The repetitive waving of magical policy wands to conjure up laptops, mentors and days out on campus will only serve to leave us with ever-increasing numbers of students and families who are left out and disengaged. Those numbers will continue to rise unless we take the time to engage critically with the complex, numerous and damaging inequalities that working-class young people face.

Reshaping university outreach

This leaves us with something of a conundrum. As HE professionals, what on earth can we do about all of that? Is it our place to address an issue so vast, and so intimately tied to the turning cogs of government policy and societal inequality?

Well, if recent conversations about higher education’s civic purpose are anything to go by, the answer is undoubtedly yes. And we need to do it better. In our mad scramble to do something to support young learners during the first, second, and now third national lockdown, our ‘thing’ has become online workshops.

Many of us have acknowledged the ramifications of the digital divide, but we have shied away from them in our work to widen HE participation. We’ve kept doing what we’ve always done, but switched to a model of online delivery which restricts who can access the content. Can we honestly say, given the disparity in digital participation between the most and least affluent groups, that this is the right answer to the question?

Rather than an online workshop series on ‘choosing universities’, would our time and resources be better spent organising student ambassadors from computing subjects to staff a freephone helpline supporting young people in the community to get online? Could we distribute workbooks with local newspapers? Could we, as they did at Lancaster, work in partnership with other local and national organisations to offer more holistic support, support which ensures that as many students as possible are able to participate in education digitally?

For us, the answer is yes. Yes, we should. And we can start by meaningfully engaging with the communities our universities serve, taking the time to properly listen to and understand the questions before working with those communities to provide an answer.

Currently based at the University of Portsmouth, Dr Alex Blower has worked as a professional in widening access to higher education for the last decade. He completed his doctoral research on education and inequality last year; his research interests focus on class, masculinity and higher education participation. Follow Alex via @EduDetective on Twitter.

Nik Marsdin is currently lead for the Morecambe Bay Curriculum (part of the Eden North Project) at Lancaster University. He worked in children’s social care, youth justice and community provision for 12 years prior to moving into HE. His research interests are widening participation, school exclusion, transitions in education and alternative provision. Follow Nik via @MarsdinNik on Twitter.



Blue-skies thinking

by Paul Temple

A few years ago, a recently-retired Permanent Secretary talked to our MBA group at the Institute of Education, on a Chatham House rule basis, about policy-making in government. One of his remarks which stayed with me was about the increased speed of policy change during his professional lifetime. The key word here was “change” – as an end in itself. A newly-appointed Secretary of State, he explained, after a week or so in the job, would be invited to pop in to Number 10 for a cup of tea. “How’s it going, then?” he or she would be asked. If the answer was, “Oh, fine, thanks, everything seems to be running smoothly”, then they were toast. The correct answer was, “Well, I expected a few problems in taking over from X, but, really, I was shocked to discover how bad things are. But I’ve got a grip on it, and I’ll be making big changes.” Status around the Cabinet table depended on the boldness and scope of the policy changes your Department was pursuing. Effectiveness was a secondary matter.

The March 2020 budget included the commitment for the Government to “invest at least £800m” in a “blue-skies” funding agency, to support “high risk, high reward science”[1]. This seems to be the one possibly lasting legacy of Dominic Cummings’ reign in Downing Street: as I noted in my blog here on 6 February 2020, one of his stated goals was to create a UK version of the US Defence Advanced Research Projects Agency (DARPA), famous for initiating the internet. The House of Commons Science and Technology Committee reported on the Government’s plans on 12 February 2021[2], expressing puzzlement about the lack of detail on the proposed Agency’s remit since the proposal was unveiled in the December 2019 Queen’s Speech: “a brand in search of a product” was the Committee’s acid summing-up of the position. (Perhaps Cummings is being missed more than was predicted.) The Committee recommended that the “Haldane principle should not apply to how UK ARPA’s overall focus is determined. Ministers should play a role in shaping ARPA’s initial focus” but after that, it should be able “to pursue ‘novel and contentious’ research without case-by-case Ministerial approval” (p45). Which Minister(s) will have this focus-shaping responsibility is not yet clear.

The Committee obviously struggled to see what precisely an ARPA could do that UKRI, with perhaps some amended terms of reference, could not do. But of course the big difference is that an ARPA will be change – a shiny new initiative – and so much better for the Minister involved than tinkering with existing bits of governmental machinery. I expect they’ll find a way to launch the ARPA involving the Minister standing next to some fancy scientific kit wearing a hi-vis jacket and a hard hat.

As David Edgerton has pointed out[3], the so-called Haldane principle – that government should decide on overall research funding but that decisions on individual projects should be made by researchers – was never actually formulated by Haldane himself (Viscount Haldane, 1856-1928) and has a somewhat chequered history in science policy. Nevertheless, for much of the twentieth century, what was considered to be the Haldane principle underpinned the funding of UK research, with the idea of academic freedom so central to research funding that, as Edgerton says, it was “a principle that didn’t need to be written down”. That was then.

This began to change with the 1971 report by Lord Rothschild on The Organisation and Management of Government R&D[4], which, controversially, introduced the client/contractor relationship into public funding of research. That report set in motion the long and winding journey, via the Research Assessment Exercises starting in 1986, which led to the “impact statements” of the 2014 Research Excellence Framework, intended to demonstrate proposals’ value for money. As Susan Greenfield once remarked[5], this was like saying that you’re only going to back winning horses.

Lyn Grove, whose PhD research[6] cast a fascinating light on why and how researchers approached their topics, quoted one of her respondents as saying, “the main thing is that you should try to do research that answers a question that is troubling you, even if it’s not yet troubling the rest of the world”: a pretty good summary of what blue-skies research should do. Is the ARPA blue-skies proposal going to take us, at least in part, back to a lost world, where researchers could pursue troubling ideas without considering their possible “impact” and where failure was accepted as an unavoidable aspect of research work? Has research policy, almost inadvertently, really run full-circle, driven by the incessant demand for novelty in policy-making? In the context of increasingly intrusive interventions by government into everyday university life (the idea of a university “woke warden”[7] would until recently have been a good joke), it somehow seems implausible. But we can always hope.

SRHE member Paul Temple is Honorary Associate Professor, Centre for Higher Education Studies, UCL Institute of Education, University College London. See his latest paper ‘University spaces: Creating cité and place’, London Review of Education, 17 (2): 223–235 at https://doi.org/10.18546


[1] House of Commons Science and Technology Committee website, visited 13 February 2021

[2] https://publications.parliament.uk/pa/cm5801/cmselect/cmsctech/778/77803.htm

[3] Research Fortnight 12 December 2018

[4] Published with other material as HMSO (1971) A Framework for Government Research and Development Cmnd 4814. London: HMSO

[5] Greenfield, S (2011) ‘Research – the current situation and the next steps’ in The future of research in the UK – value, funding and the practicalities of rebalancing the UK economy London: Westminster Education Forum

[6] Grove, L (2017) The effects of funding policies on academic research Unpublished PhD thesis London: UCL Institute of Education

[7] briefing@wonkhe.com, 15 February 2021




SRHE News on teaching and learning

By Rob Cuthbert

One of the benefits of SRHE membership is exclusive access to the quarterly newsletter, SRHE News, archived at https://www.srhe.ac.uk/publications/. SRHE News typically contains a round-up of recent academic events and conferences, policy developments and new publications, written by editor Rob Cuthbert. To illustrate the contents, here is part of the January 2021 issue which covers Teaching and Learning.

Academic development and Pro VC roles can go together

Fiona Denney (Brunel) reported her research in International Journal of Academic Development (online 13 December 2020) based on interviews with four Pro VCs with academic development backgrounds: “Over the past two years, four research-intensive universities in the UK have appointed senior academic leaders from academic development backgrounds, a new phenomenon in this sector of UK higher education that may suggest a changing pattern. This study interviewed these four leaders to explore what the appointment means for their academic identity. The interviewees identified internal and external drivers for change and noted their backgrounds as academic developers made their routes into these senior roles different from their peers. For this reason, their ‘academic credibility’ was critical in order to implement culture change effectively.”

How metrics are changing academic development

Roni Bamber (Queen Margaret University) blogged for Wonkhe on 18 December 2020 about her monograph for SEDA, Our days are numbered. Great title, good read.

SoTL in action

The 2018 book edited by Nancy Chick was reviewed by Maik Arnold (University of Applied Sciences, Germany) for Arts and Humanities in Education (online 12 October 2020).

Innovations in Active Learning in Higher Education

The new book by SRHE members Simon Pratt-Adams, Uwe Richter and Mark Warnes (all Anglia Ruskin) grew out of an Active Learning conference at Anglia Ruskin University. In the words of the foreword by Mike Sharples (Open University), it “shows how to put active learning into practice with large cohorts of students and how to grow that practice over many years. The authors come from a variety of institutions and discipline areas … What they have in common is a desire to improve student engagement, experience and outcomes, through active learning approaches that work in practice and are scalable and sustainable.” Free to download from the publishers, Fulcrum.

Now that’s what I call a publishing event

The new book by Keith Trigwell (Sydney) and Mike Prosser (Melbourne) Exploring University Teaching and Learning: Experience and Context, was launched on 10 December 2020, more than 20 years since Understanding Learning and Teaching appeared in 1999. The book focuses on university teachers’ experience of teaching and learning, discussing the qualitative variation in approaches to university teaching, the factors associated with that variation, and how different ways of teaching are related to differences in student experiences of teaching and learning. The authors extend the discussions of teaching into new areas, including emotions in teaching, leadership of teaching, growth as a university teacher and the contentious field of relations between teaching and research.

Psychological contract profiling for managing the learning experience of higher education students

László Horváth (ELTE Eötvös Loránd University, Budapest) used a service marketing approach for his article in the European Journal of Higher Education (online 27 January 2020): “Combining … six factors for expectations (personalization, development of soft skills, competent teachers, labour market preparedness, support, flexibility) and three factors of obligations (performance and activity, preciseness and punctuality, obedience and respect), we created Psychological Contract Profile Clusters (outcome-centred, teacher-centred, learner-centred, learning-centred, content-centred and self-centred students).”

“Grade inflation remains ‘a significant and pressing issue’”

That was how the OfS chose to present its analysis of degree outcomes published on 19 November 2020, quoting OfS chief executive Nicola Dandridge. The report itself said the rate of increase in ‘grade inflation’ had slowed in 2018-2019, and buried in the text was this: “It is not possible to deduce from this analysis what factors not included in the modelling (such as improvements in teaching quality, more diligent students or changes to assessment approaches) are driving the observed changes in degree attainment.” No recognition by OfS of the research by Calvin Jephcote (Leicester), SRHE members Emma Medland and Robin Lygo-Baker (both Surrey) published in Assessment and Evaluation in Higher Education, which concluded: “The results suggest a much more positive and proactive picture of a higher education system that is engaged in a process of continuous enhancement. The unexplained variables, rather than automatically being labelled as grade inflation, should instead point to a need to investigate further the local institutional contextual factors that inform grade distribution. The deficit lens through which ‘grade inflation’ is often perceived is a damaging and unhelpful distraction.” Perhaps Nicola Dandridge was auditioning for Queen of Hearts in the OfS Christmas panto: “Sentence first, verdict afterwards”.

Jephcote, Medland and Lygo-Baker had also blogged for Wonkhe on 14 October 2020 about their research: “Evidence for why grades are trending upwards, or the less loaded phrase of grade improvement, reveal a complex landscape. According to our recent research, the most influential determinants of grade improvement were shown to be the geographic location of an institution, research output quality and the increasing quality of student cohorts – although even this variable was determined on grade entry points, which the recent A Level debacle in the UK has pulled into question. … What this evidence reveals is that a combination of student aptitude, and changes to the structure and quality of UK higher education, appear to be largely accountable for graduates attaining higher grades. It also, importantly, points to the problems associated with our criterion-referenced approaches to assessment being critiqued using a norm-referenced rationale.”

Peer review of teaching in Australian HE: a systematic review

The article by Alexandra L Johnston, Chi Baik and Andrea Chester (all Melbourne) was in Higher Education Research and Development (online 18 November 2020): “A thematic synthesis revealed teaching development outcomes gained through peer review of teaching span factors at organisational … program … and individual … levels. Organisational factors included disciplinary context, program sustainability, collegiality and leadership. Program factors included framework, program design, basis of participation, observation, feedback and reflective practice. Factors at the individual level included prior experience and participants’ perceived development requirements.”

What do undergraduate students understand by excellent teaching?

SRHE member Mike Mimirinis (West London) published the results of his SRHE-funded research in Higher Education Research and Development (online 21 November 2020): “This article explores undergraduate students’ conceptions of what constitutes excellent teaching. … semi-structured interviews with students at two English universities yields five qualitatively different conceptions of excellent teaching. In contrast to the current intense policy focus on outcome factors (eg graduate employability), students predominantly discern process factors as conducive to excellent teaching: how the subject matter is presented, what the lecturer brings to the teaching process, how students’ personal understanding is supported, and to what extent the questioning and transformation of disciplinary knowledge is facilitated. More importantly, this study demonstrates that an expansion of students’ awareness of the nature of teaching is internally related to the expansion of their awareness of the nature of disciplinary knowledge.”

The German sense of humour

The article in Studies in Higher Education (online 3 June 2019, issue 2020:12) was based on two large surveys of how teachers used humour in their teaching, and how students responded. It seems to come down to what the teachers meant by using humour. The research was by Martin Daumiller and three other colleagues at Augsburg.

Teaching in lifelong learning: A guide to theory and practice

The third edition was published in 2019, edited by James Avis, Roy Fisher and Ron Thompson (all Huddersfield).

A conceptual framework to enhance student learning and engagement

Alice Brown, Jill Lawrence, Marita Basson and Petrea Redmond (all Southern Queensland) had an article in Higher Education Research and Development (online 28 December 2020) about using course learning analytics (CLA) and nudging strategies, based on “a 12-month research project, as well as by the theoretical perspectives presented by communication and critical literacies. These perspectives were applied to develop a conceptual framework which the authors designed to prioritise expectation management and engagement principles for both students and academics. The article explains the development of the framework as well as the elements and key communication strategies it embodies. The framework contributes to practice by explaining and justifying the accessible, time-efficient, student-focused approaches that can be integrated simply into each course’s online learning pedagogy to support both academics’ and students’ engagement.”

Rob Cuthbert is the editor of SRHE News and Blog, emeritus professor of higher education management, Fellow of the Academy of Social Sciences and Fellow of SRHE. He is an independent academic consultant whose previous roles include deputy vice-chancellor at the University of the West of England, editor of Higher Education Review, Chair of the Society for Research into Higher Education, and government policy adviser and consultant in the UK/Europe, North America, Africa, and China. He is currently chair of the SRHE Publications Committee.