
The Society for Research into Higher Education


Ian McNay writes…

by Ian McNay

My main concern in this post is about academic freedom, free speech and surveillance. I write as one who worked on Open University courses in the 1980s (under a previous Conservative administration) which were investigated for alleged Marxist bias.

Bahram Bekhradnia, in a recent HEPI blog, has identified the small number of complaints about free speech on campus which have provoked a government response, and the ideological base of those making them. The same was true in my experience – single figure numbers of complaints about courses with 5,000 students a year in some cases and many more thousands of ‘drop-in’ viewers and listeners for OU/BBC programmes. The allegations were found to be unjustified, but led to significantly increased levels of internal monitoring and accountability. The BBC staff were very scared of possible government sanctions.

For one radio programme, Rosemary Deem, an SRHE notable, was barred from contributing because she was, as I then was, a member of the Labour party. Two was too many. I was forced to accept a distasteful right-winger, who insisted that his contribution – denying Tory cuts to education budgets – could not be criticised, questioned, commented upon, nor edited. The new rules said that all elements of a course had to be internally balanced – not one programme putting one point of view and a second with another. Ditto for course units. The monitoring group said the programme was biased, lacked balance, and should not be broadcast. I said that students were intelligent enough to recognise his ‘pedigree’ and it went out.

In 1988, another programme, on the Great Education Reform Bill, was broadcast at 2am. We arrived later that morning to a phone message from DES demanding a right of reply. The programme had featured John Tomlinson’s comments/critique. He was the Chief Education Officer of Cheshire, hardly a hotbed of revolution. We pointed out that DES staff had been sitting in our ‘classroom’ and that comments could be made to their tutor, discussed with fellow students in self-help groups, and used, if evidenced, in assessments.

My concern is that, as someone who writes on policy and its impact, my work can be seen as ‘disruptive’ [a basic element of much research and education] and ‘causing discomfort and inconvenience’ to some people – mainly policy makers. Those terms are from the current draft bill on police, crime, sentencing and courts, which aims to limit public demonstrations of dissent. Given trends in other countries, and government resistance to a more balanced view of history, I wonder how long it will be before there is more overt intrusion – by OfS? – into controlling the curriculum and suppressing challenging, but legitimate views. In the OU, Marxist critique disappeared for years, as self-censorship operated to avoid recurrent problems of justification. It could happen again.

That goes alongside recent developments with Microsoft surveillance which are intrusive and irritating. The university has just had an ‘upgrade’. In my experience, such upgrades, like restructuring, rarely improve things, and often do the opposite. I now get daily emails from Cortana, a Microsoft offshoot, saying things like ‘Two days ago you were asked a question by X. Have you replied?’ The answer is that ‘if you are reading my emails, you will know the answer to that question’. Undeterred, this AI avatar offers me advice on how to organise my coming week, blithely ignorant that I have only a 0.2 contract. When it says I have 85% of my time ‘spare’, that implies that of my 20% load, only 5% that week was not observable. Its daily plan for me is to spend 2 hours in the morning, ‘focus time…to get your work done’.

The rest is spent not getting my work done, but on email and chats, taking a break and lunch, and two hours to learn a new skill and develop my career. Wow! Do those in charge of the balanced academic workload know about this prescription? It also believes that all emails are ‘developing your network … you added 23 new members to your networks last week’. A computer network must be much less demanding than my criteria require for the term. Its autonomous, unaccountable and unexplained treatment of my emails includes frequently deleting one when I click to open it, and designating as ‘junk’ PDF journal articles relevant to my work sent by Academia. I then have to spend time digging around to find both of these. It also merges emails into a stream, so that finding one of them needs a memory of the last one in the stream – often an automatic reply. More time spent digging around.

Then there are the constant disruptive phone calls to verify my sign-in. The automated voice advises me that ‘if you have not initiated this verification, you should press such-and-such a key’. I did that, twice, once when two such calls came within 50 seconds of one another, which I thought suspicious. How simple-minded I was! The ‘solution’ was to bar me from access until the systems administrators had sorted things. That meant a full day or more in each case. The two most recent calls even came when I had not moved to laptop-based work. I now no longer log out, so I do not sign in, but leave the machine on all day, every day, which may not be good ecologically, but it helps my mental health and state of mind.

I accept the need for computer security, with university-generated messages warning about emails from sources outside the university, such as OfS, AdvanceHE, or HEPI and Research Professional through a university subscription, asking if I trust them. Up to a point, Lord Copper. But balance is the key. I knew there was surveillance – in a previous institution a NATFHE e-mail was held up to allow management to get its reply in simultaneously with its being sent. This, though, is blatant and overt. I suppose that is better than it being hidden, but it is neither efficient nor effective. Am I the only one experiencing this, a human being balancing Marvin, the paranoid android, or do others have similar experiences? If the latter, what are we going to do about it? It has implications for research in terms of the confidentiality of email interviews, for example.

And, finally, on a lighter note … my local optician has a poster in the window advertising ‘Myopia Management’. That sounds like a module to include in the leadership programmes that some of us run.

SRHE Fellow Ian McNay is emeritus professor at the University of Greenwich.


Cronyism, academic values and the degradation of debate

by Rob Cuthbert

The pandemic has accelerated many trends which were already apparent, such as the switch away from the high street to online purchasing, and in HE the move to online, remote and asynchronous learning. The influence of social media has also accelerated, partly or wholly replacing the normal policy business of face-to-face discussion and debate. But perhaps the most significant change of all for HE has been the accelerating decline in the quality of regulation, governance and policy debate.

The Higher Education and Research Act 2017 may come to be seen as the high water mark of a particular kind of policymaking which has been ebbing rapidly ever since: the tide has gone out on deliberative and measured debate. A majority in HE strongly opposed marketisation, but the Act was the culmination of a long period of debate which at least gave credence to opposing views and saw them represented in discussion inside and outside Parliament. The market ‘reforms’ were promoted by ministers – David Willetts and Jo Johnson in particular – who had at least grudging respect from many in the system, because of their own respect for academe, however partial it sometimes seemed. And much though we might regret the marketisation changes and seek their reversal, we might also accept that they were enacted by a government which had a mandate for change explicitly endorsed by the electorate. But that was then.

In 2019 the government was returned with a sufficient majority to ‘get Brexit done’, which it did, much to the dismay of most in higher education. HE’s dominant Remainer sentiment no doubt helped to fuel disregard in Whitehall for HE opinion. What is often wrongly still called ‘debate’ has been polarised, accentuated by social media’s echo chambers during the lockdown. In the ‘culture war’ both sides have dug their trenches and hoisted the ‘no surrender’ flags. In HE this has diverted attention away from the real and massive problems of the student experience in the pandemic, and towards the misrepresented and overstated issue of free speech, academic freedom and diversity of opinion. The supposed justification for recent free speech initiatives in HE has been amply covered elsewhere, and is summarised in SRHE News 44 (April 2021).

In this culture war academics and academic institutions have their share of blame. The Policy Exchange ‘research’, cited in support of the Secretary of State’s recent announcements, shoddy though it was, nevertheless pointed to the issue of Remainer conformism in much British academic culture, in which some staff have self-censored their support for Brexit. I tried much earlier to parody this conformism, arguing that “perhaps the best thing to do was to accept the will of the people, freely expressed”. But democracy depends on the willing consent of the governed, and the governed in HE are increasingly unwilling to consent to changes in which their views are simply ignored. There is no shortage of comment on new policy initiatives; the HE sector is comparatively well-served by think tanks such as HEPI and WonkHE, as the recent CGHE seminar on ‘Universities in Medialand’ suggested. But there is little sign that government takes note of policy commentary which contradicts its current narrative, even when obvious contradictions are pointed out. Thus, for example, market forces must rule, except when students choose the ‘wrong’ universities. The student experience is paramount, except when students report high levels of satisfaction – so the National Student Survey, until yesterday a crucial element for teaching excellence, must today be rubbished.

Nowhere has the contempt for opposing views been more obvious than in the appointment of a new Chair for the Board of the Office for Students. The notes to the 2017 Act establishing the OfS explained that: “This Act creates a new non-departmental public body, the Office for Students (OfS), as the main regulatory body, operating at arm’s length from Government, and with statutory powers to regulate providers of higher education in England.” (emphasis added). The first OfS chair was Sir Michael Barber. It was rumoured that Barber sought a second term but was denied. Who might be appropriate to take on the role? Another respected figure with experience of HE and of working with government, able to sustain that arm’s length role for the Office? Former UUK chair Sir Ivor Crewe (former VC, Essex) was interviewed, as Sonia Sodha and James Tapper reported for The Observer on 14 February 2021: “Perhaps it was the long passage in Professor Sir Ivor Crewe’s book The Blunders of Our Governments about the way ministers’ mistakes never catch up with them that led Gavin Williamson to reject the expert as the new head of the Office for Students. Or maybe the education secretary was put off by the section of the 2013 book, written with the late Anthony King, dealing with how ministers put underqualified, inexperienced people in charge of public bodies. The job of independent regulator of higher education in England was instead handed to James Wharton, a 36-year-old former Tory MP with no experience in higher education who ran Boris Johnson’s leadership campaign.” The selection panel had been criticised for its dominant reliance on government supporters rather than HE expertise, but the chair-designate was nevertheless still to have his appointment endorsed by the Parliamentary Education Select Committee.

The Committee’s approval was very likely but could not be taken for granted, and Nick Hillman made some sensible proposals in his HEPI blog on 12 January 2021 on ‘How to grill the prospective chair of OfS’. We’d have suggested grilling on both sides, but presumably Boris Johnson’s campaign manager only has one side. The Education Select Committee duly questioned Lord Wharton of Yarm on 5 February 2021 and endorsed his appointment, which was announced by OfS on 8 February 2021. Rob Merrick reported for The Independent on 2 February 2021 that Lord Wharton had been subject to ‘hard questioning’, in the course of which he said he didn’t see why he could not retain the whip, nor why his role as Boris Johnson’s campaign manager should raise any conflict of interest issues.

So the ‘independent’ regulator was to have a partisan chair who proposed to retain the government whip. Conflict-of-interest issues raised themselves almost immediately, with wider ripples than expected. Lord Wharton had just been installed as Chair when he was revealed to be a paid adviser to a company seeking to build a cable connection through land at the University of Portsmouth. The company, Aquind, has a £1.2 billion project to connect the electricity grids of the UK and France. It wants to put a cable across University of Portsmouth land, which the University opposes because of the disruption it would cause. Portsmouth Council and local Conservative and Labour MPs all oppose the project. Aquind director Alexander Temerko is a Conservative Party donor, whose website has several pictures of him with Lord Wharton, and also pictures him with the Prime Minister and Secretary of State Gavin Williamson. The planning dispute, involving possible compulsory purchase, has reached the Secretary of State for Business, but the previous incumbent Alok Sharma had to recuse himself from the case because his constituency party had received £10,000 from Temerko. Sean Coughlan told the story for the BBC on 19 February 2021, noting also that: “Conservative MP David Morris, another recipient of a donation, had to apologise to the House of Commons for a breach of paid advocacy rules after asking a question in support of the Aquind cable project.”

Lord Wharton’s appointment was greeted with incredulity in HE, but attracted little interest more broadly; in macropolitical terms the chair of OfS is a small bauble. And there were of course already many higher-profile reports of cronyism in government. The difficulty for HE is that the regulator may now be driven further and faster to unrealistic extremes. OfS, obediently pursuing its statutory responsibilities and ‘having regard to ministers’, is already in danger of leaving HE realities behind:

  • On 14 January 2021 the OfS wrote to universities and other HE providers, hard on the heels of a DfE letter to OfS, saying that the regulator expected institutions “to maintain the quality, quantity and accessibility of their provision and to inform students about their options for refunds or other forms of redress where it has not been possible to provide what was promised.” Universities are losing tens of millions every week during the lockdown, without the kind of support provided for many other sectors, and on student hardship “the government can never quite resist overselling the multiple purposes to which the money might meaningfully be put”, as David Kernohan and Jim Dickinson argued in their WonkHE blog on 2 February 2021.
  • The OfS consultation document issued on 26 March 2021 put into practice the ‘instructions’ received earlier from Secretary of State Gavin Williamson. It proposed to steer more funds to STEM subjects and, among other things, halve additional funding for performing arts, media studies and archaeology courses. WonkHE’s David Kernohan was quick off the mark with his critical analysis on 26 March 2021.
  • OfS announced on 30 March 2021 that after the first phase of a review of the NSS, commissioned by Universities Minister Michelle Donelan, there would be ‘major changes’ including dropping all references to ‘student satisfaction’. Of course, consistent reports that 85% or more of students in most universities are satisfied with their experience would be embarrassing for a government determined to prove otherwise.
  • OfS Director of Regulation Susan Lapworth blogged for WonkHE on 31 March 2021 about a new condition of registration which would allow OfS to step in where a provider was at risk of failure, not to rescue the provider but to prevent a ‘disorderly’ closure. OfS had consulted on the proposal, which was not supported by most respondents, but went ahead anyway. The condition affects only the failing provider. Two obvious problems: (1) failing providers might not be inclined or well-placed to take the protective measures which OfS deems necessary; (2) previous experience shows that students need help from other institutions to facilitate transfers, but the Condition is silent on other institutions. They will often be willing, but might be unable to help without further support.

In the past, funding councils were statutorily responsible for, in effect, providing a buffer between HE and government, to regulate excesses on either side. There is no danger of ‘provider capture’ in the new framework; the risk now is that the arm’s-length relationship with government has very short arms. Recent US experience shows the danger of such closeness. The Obama administration’s tighter regulation of for-profit HE, introduced after well-publicised shortcomings, was swiftly reversed by Donald Trump’s Education Secretary Betsy DeVos, but Joe Biden is now progressively restoring Obama’s closer regulation. Such to-ing and fro-ing simply creates a more disorderly system for students to navigate.

We can learn a better lesson from the US: Michelle Obama’s dictum “when they go low, we go high”. We need to reinforce our support for academic values across the sector by continuing to show respect for opposing views, and to win cases by argument rather than by seeing who can shout loudest on social media. We have examples in the way that, for example, Eric Lybeck (Manchester) has offered to debate free speech with the authors of the Policy Exchange report. We also need to broaden the base of explicit opposition, and not leave it to the usual suspects: in particular, we need university leaders to step up and speak out more than they do.

It is often true that leaders can be more persuasive in private conversations than public speeches, but in current circumstances leaders, especially vice-chancellors, need to be more concerned that they will lose the confidence of staff and students if they fail to speak out publicly. There are honourable exceptions, but too many vice-chancellors seem to be more interested in avoiding blame than speaking out about real problems. It is certainly not easy, operating in the space between government, staff or student disapproval and social media pile-ons from the left or right; just one past or present remark or action, if uncovered or reinterpreted, could be career-ending. But that is why our leaders are well paid – to pursue the best interests of the institution and the people in it, not to be silenced just because the  problems are very difficult, nor out of fear or self-interest. We have recently seen research leaders not hesitating to speak out about proposed cuts in research funding – and those cuts have now been reversed. We need more people, leaders and staff on all sides, to speak truth to power – not just playing-to-the-gallery ‘our truth’, but a truth people inside and outside HE will find persuasive.

Rob Cuthbert is an independent academic consultant, editor of SRHE News and Blog and emeritus professor of higher education management. He is a Fellow of the Academy of Social Sciences and of SRHE. His previous roles include deputy vice-chancellor at the University of the West of England, editor of Higher Education Review, Chair of the Society for Research into Higher Education, and government policy adviser and consultant in the UK/Europe, North America, Africa, and China.


Making space for compassion

by Marcia Devlin

As is the case in many countries, the COVID pandemic continues to wreak havoc in Australian universities. While many universities are now beginning to experiment with ‘hybrid’ models that combine online and face-to-face teaching and learning, efforts are tentative. Executives and staff are nervous about committing to what many students are increasingly telling us they want – a ‘normal’ university student experience with on-campus components.

This nervousness is well-founded. Our vaccine rollout is not rolling, the federal and state governments are blaming each other, and we have had to have another snap lockdown just this week. Ensuring ‘COVID-safe’ campuses in these circumstances is tricky and, not to put too fine a point on it, connected to potential life and death scenarios.

International borders remain closed. International students – so important to Australian universities and their finances – are not allowed into the country. The current Education Minister gave a speech about the future of international students this week. It was invitation-only but, from what can be gleaned from social media commentary by those fortunate enough to secure an invitation, it didn’t leave audience members brimming with confidence about the immediate future.

In this set of circumstances, it is challenging to focus on the core university ‘businesses’ of teaching and research. This challenge is exacerbated by the fact that those providing the education and research are human beings who are themselves living in and through the pandemic. The work of academics and professionals in universities is complex, messy, deeply human and relies on individual passion and goodwill as well as qualifications, knowledge, skills and experience.

I attended a seminar recently at one of my alma maters, Macquarie University, led by a well-known Australian author, Hugh Mackay: the importance of compassion was central. Arguing that the most significant thing about us as people is that we share a common humanity, that we humans all belong to a social species, that we are “hopeless” in isolation and that we need others to nurture and sustain us, Mackay underscored the importance of compassion, kindness and simply being nice to one another in our current shared pandemic context.

I’m not sure about other SRHE readers, but compassion and kindness aren’t topics I’ve often heard discussed in universities in my 30 years in the sector. Mackay suggested the pandemic has been a mass experiment around what happens to people when they are isolated. The results have included more anxiety, more suicidal ideation, more domestic violence, among many other negative outcomes. But also more time for introspection and for deep consideration of what is important to us. Many of us have more clearly understood how crucial our social and personal connections are.

Mackay proposes that many of us have previously found useful hiding places in ambition, IT devices and consumerism, which have promoted individualism and competitiveness and a greater focus on ourselves than on our role in families, communities and society. As I reflected on university life, and life more generally, I couldn’t help but think he had a point.

As we co-create the ‘COVID-normal’ university, I wonder if we might all find a bigger space for our humanity, our compassion and our kindness to each other. Not only might that bring a better experience of work in universities for ourselves and those around us, the quality and impact of our education and research might also improve as a result.

Former Senior Deputy Vice-Chancellor Marcia Devlin is a Fellow of SRHE and an Adjunct Professor at Victoria University, Melbourne, Australia.



Dupery by Design

by Petar Jandrić

Since the election of a number of right-wing populist governments across the world, there have been increasing concerns that fake news in online platforms is undermining the legitimacy of the press, the democratic process, and the authority of sources such as science, the social sciences and qualified experts. The global reach of Google, YouTube, Twitter, Facebook, and other platforms has shown that they can be used to spread fake and misleading news quickly and seemingly without control. In addition to their power and reach, these platforms operate, and indeed thrive, in what seems to be an increasingly balkanised media eco-system where networks of users will predominantly access and consume information that conforms to their existing worldviews. Conflicting positions, even if relevant and authoritative, can be suppressed, discredited or overlooked as a result of filter bubbles and echo chambers.

Digital technologies have contributed to the prolific spread of false information, encouraged ignorance in online news consumers, and fostered confusion about how to determine fact from fiction. These same technologies have, however, permitted marginalised voices to be heard (transgender and autistic communities, victims of street harassment, for example), encouraged diversity, facilitated error detection and investigative accountability, and challenged privilege and prejudice. This opens up myriad questions such as:

  • How are online platforms designed to exploit particular vices such as close-mindedness, epistemic nihilism, insouciance, etc. and contribute to the power and dissemination of deception?
  • Deception: what is it? Is there anything peculiar about the times in which we live that should raise special concerns about the proliferation of fake news, lies, bullshit and other such vices online?
  • How do our individual and collective epistemologies interact with digital technologies to produce deceit?
  • How can we counter epistemic vices online, and protect ourselves and our institutions from their potentially baneful effects?
  • Can deception ever be justified? Is there anything to be learned from mass propaganda and deceit in other historical periods?

The epistemology of deceit in a postdigital era

To address these and related questions, Alison MacKenzie, Jennifer Rose, and Ibrar Bhatt have edited a book, The Epistemology of Deceit in a Postdigital Era: Dupery by Design. The book offers strong theoretical and philosophical insight into how digital platforms and their constituent algorithms interact with belief systems to achieve deception, and how related vices such as lies, bullshit, misinformation, disinformation, and ignorance contribute to deception. This inter-disciplinary collection explores how we can better understand and respond to these problematic practices.

Continuing the editors’ earlier work in the Special Issue of Postdigital Science and Education, ‘Lies, Bullshit and Fake News Online: Should We Be Worried?’, the contributors to the collection discuss the diverse ways in which deception is a pervasive feature of our communicative lives. Among the issues explored is how the design and infrastructure of digital platforms enable (or disable) us to distinguish between what is true and truthful; fake or real; informative, disinformative, misinformative or malinformative; and other such information disorders. The scale of the dupery impacts on human rights, individual freedoms and dignity, agency and autonomy, in addition to the harms mentioned above.

The role of higher education is critical within this context, as universities have traditionally been regarded as sites of epistemic authority where knowledge is created and disseminated through the work of academics and theoretically grounded systems of teaching. Recent trends have shown that universities market the idea that an education through them will create ‘future-ready’, ‘globally-aware’ and ‘critically-thinking’ graduates, equipped with the relevant skills and knowledge to deal with issues facing our modern world, including public health crises, climate change and conflict.

The book was launched at a successful SRHE event held on 16 March 2021, in which the editors, authors, and more than 100 members of the public engaged in a lively discussion.

What is next?

These days, there is really interesting research taking place in different fields about post-truth and online deceit. Closer to higher education, an interesting example is Michael A Peters, Sharon Rider, Mats Hyvönen, and Tina Besley’s popular book Post-Truth, Fake News: Viral Modernity & Higher Education, which discusses the meaning and purpose of higher education in a ‘post-truth’ world.

Aided by a unifying postdigital theoretical framework which holds that human beings are systematically embedded in digital infrastructures, Alison MacKenzie, Jennifer Rose, and Ibrar Bhatt in The Epistemology of Deceit in a Postdigital Era: Dupery by Design make a unique contribution by reaching across disciplinary boundaries to explore, examine and counter online deception, and by analysing the power of social platforms and their role in the proliferation of epistemic harms. This line of inquiry is in its early days, and it will be very interesting to see where it develops in the future.

Petar Jandrić is a Professor at the University of Applied Sciences in Zagreb (Croatia), Visiting Professor at the University of Wolverhampton (UK), and Visiting Associate Professor at the University of Zagreb (Croatia). His research interests are focused on the intersections between critical pedagogy and information and communication technologies. He co-authored the chapter ‘Scallywag Pedagogy’ with Peter McLaren (Chapman University, California) in Post-Truth, Fake News: Viral Modernity & Higher Education. pjandric@tvz.hr



Widening participation, student engagement, alienation, trauma and trust

Caroline S Jones and Zoë Nangah

Social mobility target setting and progression data collection have long been on the agenda for UK HE policy makers and are widely documented, debated and researched (Connell-Smith and Hubble, 2018; Donnelly and Evans, 2018; Social Mobility Commission 2017, 2019; Phoenix, 2021). Widening Participation (WP) policy underpins much Government target setting, dressed up as a key factor in addressing the nation’s social mobility issues. Much of the work undertaken in this field focuses upon the recruitment of students from the WP demographic onto Higher Education (HE) programmes, with data tracking at key points of the student’s journey as a measuring tool (Vignoles and Murray, 2016; Robinson and Salvestrini, 2020; Phoenix, 2021). However, there appears to be a distinct lack of focus on the student as an individual human being, who arrives in the HE world with prior lived experience, and a lack of consideration of the impact of future life experiences aligned to the student’s individual psychological status.

This omission can have a profound effect on a student’s ability to engage in their programme of study, thus affecting their ability to progress and succeed, contributing to barriers to engagement (Jones and Nangah, 2020). On-entry assessment currently does not capture the presence of traumatic histories, and students may not feel able to fully disclose their experiences until they have established a tutorial connection. Furthermore, HE systems may not have access to information, either on-entry or during studies, that enables appropriate tutorial support and adequate referral, due to GDPR (2018) restrictions and confidentiality principles. Therefore, academic tutorial expertise, and an understanding of how to support students from a psychological perspective, might need to be considered, using specific relational elements in a humanistic manner. At system level, internal and external support for students, focusing on their holistic needs, might also improve access and progression.

These ideas led us to conduct a deeper investigation into the psychological needs of students, to seek out methods, practices and potential policy changes which might reduce barriers to student engagement. This new knowledge could enable policy makers, HEIs, HE staff and departments to improve their current practice and  strengthen progress in terms of the national social mobility agenda (Augar, 2019). Examining barriers to student engagement for the WP demographic and specifically focusing on the links between psychological alienation theory (Mann, 2001), trauma and trust (Jones, 2017) in the HE context, led us to this new angle on the conundrum of meeting social mobility targets. Furthermore, recent neurological research, such as brain and amygdala responses to threat within specific groups (Fanti et al, 2020), could be explored further within HE student populations. Students who are affected by trauma could be better supported by using research-informed practices that can then be embedded in HE, focused on individual requirements.

To make a difference to current social mobility rates and targets we need to explore new concepts to inform and drive change in the sector. Our systematic literature review (Jones and Nangah, 2020) focused on the analysis of links between alienation theory (Mann, 2001; Jones, 2017), prior, existing or present traumatic experiences, and the student’s ability to trust in the academic systems within which they are placed. The presence of traumatic emotional experiences in WP student populations connected to psychosocial and academic trust alienation theory contributes to understanding engagement barriers in HE. Using PRISMA guidelines, 43 publications were screened based on inclusion/exclusion criteria. Our review identified students’ experiences of trauma and how this had affected their HE educational engagement. It documented support strategies for student success and improvements in HEIs’ commitment to meeting WP agendas. This underlined the need for HEIs to commit to the social mobility agenda in a way which is aligned with barriers to student engagement. Current tracking and support systems may need to be augmented by tutorial systems and training for academic staff in relational tutorial systems, emphasising the presence of a consistent tutor. Jenkins (2020) suggests a single-session approach for addressing student needs within a short-term counselling model, but recognises this may not be suitable for students with more complex requirements. Thus, longer-term interventions and individualised counselling support approaches are arguably needed to support this demographic.

To decrease barriers to student engagement we need to focus on psychological well-being and collaborative HEI strategies to improve recruitment, retention and ultimate success. Our systematic review argued that a deeper understanding of the complexities of student needs should be embedded within HE teacher training programmes and curriculum delivery. Extending teaching skills to embed psychological understanding and practice delivery skills would not only work to meet Government targets but also raise aspirations: ‘…with the right approach, the transmission of disadvantage from one generation to the next can be broken’ (Social Mobility Commission, 2017: 8). Fulfilling the moral and corporate responsibility of HEIs to support the success of WP students might need new insights. A focus on student engagement in HE, informed by a better understanding of psychological alienation theory, trauma and trust, could be used by multiple HE audiences and across countries to improve practice and drive both political and educational change for the most disadvantaged individuals. It is time to view HE students from WP backgrounds as individuals, to respect their aspirational aims and value their experiences in a way that best suits their subjective requirements, so that they may progress and succeed, helping to improve social mobility.

SRHE member Caroline S Jones is an applied social sciences professional with extensive experience in the children and young people field and HE programme leadership. She is a Tutor in the Education Faculty at Manchester Metropolitan University and was previously a Lecturer at the University Campus Oldham and at Stockport University Centre. Twitter: @caroline_JonesSFHEA. LinkedIn: https://www.linkedin.com/in/caroline-jones-1bab40b3/

SRHE member Zoe Nangah has been a Lecturer/Senior Lecturer in HE for 16 years across Psychology, Social Sciences, Counselling and Childhood Studies disciplines. She is currently a Senior Lecturer and Course Leader at the University of Chester for the MA Clinical Counselling course. Zoe is a qualified counsellor and supervisor and has conducted research into emotional experiences within student populations and explored perceptions of the support services. Twitter @zoenangah 

References

Fanti, KA, Konikou, K, Cohn, M, Popma, A and Brazil, IA (2020) ‘Amygdala functioning during threat acquisition and extinction differentiates antisocial subtypes’ Journal of Neuropsychology 14(2): 226-241

Jenkins, P (2020) ‘Single session formulation: an alternative to the waiting list’ University and College Counselling 8(4), November 2020

Mann, SJ (2001) ‘Alternative Perspectives on the Student Experience: Alienation and Engagement’ Studies in Higher Education 26 (1): 7–19

Robinson, D and Salvestrini, V (2020) The Impact of Interventions for Widening Access to Higher Education London: Education Policy Institute: TASO

Social Mobility Commission (2017) State of the Nation 2017: Social Mobility in Great Britain London: Social Mobility Commission

Social Mobility Commission (2019) State of the Nation 2018-2019: Social Mobility in Great Britain London: Social Mobility Commission



In search of the perfect blend – debunking some myths

by Ling Xiao and Lucy Gill-Simmen

A blended learning approach has been adopted by most UK universities in response to the Covid pandemic. Institutions and higher education educators have become fully committed to, and invested enormous resources in, delivering such an approach. This has helped to maintain a high-quality education and, to some extent, to mitigate the potentially dire impact of the pandemic on students’ learning experience. This approach has no doubt accelerated a reshaping of pedagogic approaches, facilitating deep learning, autonomy, personalised learning and more. With the rapid pace of the UK’s vaccine rollout, and the semi-promise by the UK Government that we’ll be back to some kind of normal by the end of June, there is hope for a possible return to campus in September 2021. As a result, this now marks a time when we need to reflect on what we have learned from the blended learning approach and figure out what to take forward in designing teaching and learning post-pandemic, be it hybrid or hyflex or both.

The Management Education Research Group (MERG) in the School of Business and Management at Royal Holloway, University of London recently held a symposium on ‘Reflecting on Blended Learning, What’s Next?’. It showcased blended learning examples from various universities across the UK and was followed by a panel discussion where we posed the question: what worked and what didn’t work? We found that some of our previous assumptions were brought into question and a number of myths were debunked.

Myth 1: Pre-recorded videos should be formal and flawless

Overnight, and without any training, educators took on the role of film creators, film editors and videographers. Spending hours, days, weeks and even months developing lecture recordings from the confines of our home working spaces, we were stopping, starting, re-starting, editing out the slightest imperfection. It has to be perfect, right? Not so fast.

For many institutions, recorded video is the primary delivery vehicle for blended learning content, with academics pre-recording short presentations, lectures and informal videos to complement text-based information and communication. Many of us postulated that a formal and meticulous delivery style for pre-recorded videos is required to help maintain high-quality educational materials and for students to perceive us as professionals. Academics’ personal experiences, however, suggest it is vital to keep the human element, as students enjoy and engage better with a personalised delivery. A personalised style helps to build relationships with students, which then provide the foundations for learning. Mayer (2009) describes this as the personalisation principle of learning through video and recommends that a conversational style is preferable to a formal style for learning. This also resonates with recent insights from Harvard Business School Professor Francesca Gino, who reflects in her webinar on the power of teaching with vulnerability during COVID-19. She explains the importance of being open, honest and transparent with students and sharing one’s own human side in order to strengthen the educator-learner bond.

Myth 2: Students enjoy learning from the comfort of their homes

Blended learning empowers students to become autonomous learners, since they can engage with their courses when real-time contact with lecturers is not possible. However, such autonomy isn’t all it’s cracked up to be, and turns out to be a lonely road for many students. Instead of relishing staying at home and learning when they want, some students declare they miss the structure, the sense of community and the feeling of belonging they associate with attending university in person.

Universities are more than places for learning; they serve as the centre of their communities for students. Students not only learn directly from the education but also, just as much, from interaction and collaboration with lecturers and their fellow classmates. It emerged in conversations between students and faculty that students felt it generally took longer to establish a sense of community in an online class than in a traditional face-to-face classroom, but that it could be achieved. So it’s up to us as educators to foster a sense of community amongst online learners.

Central to a learning community is the concept of cooperative learning, which has been shown to promote productivity, expose students to interdisciplinary teams, foster idea generation and promote social interaction (Machemer, 2007). One technique is to introduce collaborative learning opportunities that reach beyond online group work and assessment – which in itself may prove problematic for learners. Instead, educators should look to develop co-creation projects such as wikis or blogs where students can come together to co-create content. Social annotation platforms such as Google Docs and Padlet enable students to share knowledge and develop their understanding of learning objects by collaborating on notes, commenting on specific parts of materials, etc (Novak et al, 2012; Miller et al, 2018). Padlet, for example, has proved particularly popular with students for collaborative learning, given its ease of use.

Myth 3: It makes sense to measure student engagement merely by participation metrics

After months of preparation and instructional design, crafting the perfect learning journey for students, we tend to expect students to learn and to engage in a way that we as educators perceive to be optimal for the fulfilment of learning outcomes.

We all know that students learn and engage in many different ways, but we often find ourselves trawling the data and metrics to see whether students watched the videos, engaged in the readings we provided, posted on the fora we clearly labelled and participated in the mini quizzes and reflection exercises we created. However, as our hearts sink at what appears to be at times a relatively low uptake, we jump to the conclusion that students aren’t engaging. Here’s the thing: they are; we just don’t see it. Engagement as a construct is far more complex and multi-faceted, and we can’t necessarily measure it using the report logs on the VLE.

Student engagement is often labelled the “holy grail of learning” (Sinatra, Heddy and Lombardi, 2015: 1) since it correlates strongly with educational outcomes, including academic achievement and satisfaction. This can therefore lead to a level of frustration on the part of educators when engagement appears low. However, engagement comes in many forms, and in forms which are often not directly visible and/or measurable. For example, cognitive, behavioural and emotional engagement all have very different indicators which are not immediately apparent. Hence new ways of evaluating student engagement in the blended learning environment are needed. Halverson and Graham (2019) propose a possible conceptual framework for engagement that includes cognitive and emotional indicators, offering examples of research measuring these engagement indicators in technology-mediated learning contexts.

Myth 4: Technology is the make or break for blended learning

The more learning technologies we can add to our learning design, the better, right? Wrong. Some students declared that the VLE had too much going on: they couldn’t keep up with all the apps and technologies they were required to work with to achieve their learning.

Although technology clearly plays a key role in the provision of education (Gibson, 2001; Watson, 2001; Gordon, 2014), it is widely acknowledged that technology should not determine but instead complement theories and practices of teaching. The onset of Covid-19 has shifted our focus to technology rather than pedagogy. For example, educators felt an immediate need for breakout room functionality: although this can be a significant function for discussion, this is not necessarily the case for disciplines such as accounting, which requires students continuously to apply techniques in order to excel at applied tasks. Pedagogy should determine technology. The chosen technology must serve a purpose and facilitate the aim of the pedagogy, and should not be used as bells and whistles to make the learning environment appear more engaging. In our recent research, we provide empirical evidence for the effective pedagogical employment of Padlet to support learning and engagement (Gill-Simmen, 2021). Technology has an impact on pedagogy but should not be the driver in a blended or hybrid learning environment. Learning technologies are only applicable and of value when the right content is presented in the right format at the right time.

In summary, we learned these lessons for our future approach to hybrid learning:

  1. Aim for ‘human’ not perfection in instructional design
  2. Students don’t want to learn alone – create opportunities for collaborative learning
  3. Student engagement may not always be measurable – consider tools for assessing emotional, cognitive and behavioural engagement
  4. Technology should support pedagogy, not vice versa – implement only those technologies which facilitate student learning

SRHE member Dr Ling Xiao is the Director of MERG and a Senior Lecturer in Financial Management at Royal Holloway, University of London.  Follow Ling via @DrLingXiao on Twitter.

SRHE member Dr Lucy Gill-Simmen is a Senior Lecturer in Marketing at Royal Holloway, University of London and Program Director for Kaplan, Singapore. Follow Lucy via @lgsimmen on Twitter.

References

Gill-Simmen, L. (2021). Using ‘Padlet’ in Instructional Design to Promote Cognitive Engagement: A Case Study of UG Marketing Students, Journal of Learning Development in Higher Education (in press).

Machemer, P.L. (2007). Student perceptions of active learning in a large cross-disciplinary classroom, Active Learning in Higher Education, 8(1): 9-29.

Mayer, R. E. (2009). Multimedia Learning (2nd edn). Cambridge, England: Cambridge University Press.



An SRHE playlist

by Leo Goedegebuure

There are many ways of communicating. Text always has been our main medium, but the last year has clearly shown that there are other ways. One of the most popular articles in the recent special issue of Studies in Higher Education on the impact of the pandemic was Amy Metcalfe’s photo-based essay. We had a massive SRHE webinar on the contributions, with a truly global audience. Taking David Bowie’s Sound and Vision to the extreme, we have done the Vision but we haven’t done the Sound.

2021 will be another special year. By the end of the year we will still not be able to come together face-to-face at the 2021 SRHE conference, although an exciting alternative kind of conference is being planned. It will be good to have a decent soundtrack for the event. So we thought we might kick this off with a bit of advance planning – and activity. Last-minute work can be a bit tedious and stressful. So we propose a two-pronged approach to this. We’ll start by inviting this year’s contributors to the Studies in Higher Education special issue to submit their 5-song playlist in addition to their accepted and online-published article. And we invite all the readers of this blog to do the same. What we expect as the outcome of this fun and silly project is a reflection of the diversity of our community in music.

So let me kick this off. The basic model is: song and a brief one-sentence reason why, plus Spotify link.  Here we go:

1 Amy Macdonald – Let’s Start a Band                                   

The obvious opener for a project like this

2 David Bowie – Life on Mars                                                     

The amazing achievement of the Mars Perseverance Rover so far and a tribute to one who left too early

3 Bruce Springsteen – The Ghost of Tom Joad                     

Too many ghosts of 2020 and a brilliant contribution of Tom Morello

4 REM – Nightswimming                                                              

Quietly avoiding restrictions without creating chaos and such a great song

5 Vreemde Kostgangers – Touwtje uit de Deur                   

My Dutch heritage; the literal translation is “A Rope from the Letterbox” reflecting on a time when you could just pull a little rope to enter your neighbour’s house

Amy Metcalfe has also skipped in already with her suggestions, which have been included in the playlist:

1 Snow Patrol – Life on Earth

“This is something else.”

2 Foster the People – Imagination

“We can’t change the things we can’t control.”

3 Haelos – Hold On

“Hold on.”

4 The Weeknd – Blinding Lights

“I’ve been on my own for long enough.”

5 Lastlings – Out of Touch

“Don’t want this to fall apart; Is this what you really need?”

There will be more to follow from contributors to the SHE special issue, but everyone is invited to send in their own 5-track playlist to rob.cuthbert@uwe.ac.uk and leo.g@unimelb.edu.au. We will provide updates via the blog at regular intervals, and aim to compile a comprehensive playlist later in the year – which may or may not become lockdown listening, depending on where in the world you are and how your country is faring in the pandemic. We hope you enjoy it.

Leo Goedegebuure is Editor-in-Chief, Studies in Higher Education, Professorial Fellow in the Melbourne Centre for the Study of Higher Education and the Melbourne Sustainable Society Institute, and Honorary Professor in the School of Global, Urban and Social Studies, College of Design and Social Context, RMIT University.



The Social Mobility Index (SMI): A welcome and invitation to debate from the Exeter Centre for Social Mobility

by Anna Mountford-Zimdars and Pallavi Banerjee

There is a new English league table on the block! Welcome! The exciting focus of this new ranking concerns social mobility – the clue is in the name and it is called the Social Mobility Index (SMI). Focusing on social mobility differentiates the SMI from other league tables, which often include dimensions such as prestige, research income, staff qualifications, student satisfaction, and employment outcomes.

The SMI is specifically about an institution’s contribution to supporting disadvantaged learners. It uses the OfS model of access to, progression within and outcomes after higher education. Leaning on a methodology developed for an SMI in the US, the English version contains three dimensions: (1) Access, drawing on the Index of Multiple Deprivation (IMD); (2) Continuation, using progression data into the second year drawing on IMD; and (3) Salaries (adjusted for local purchasing power), using Longitudinal Education Outcomes (LEO) salary data collected one year after graduation.

The SMI report thoughtfully details the rationale for the measures used and is humble in acknowledging that other measures might be developed that are more useful. But do the reflections of the authors go far enough? Take the graduate outcome LEO data, for example. These capture salaries 15 months into employment – too early for an outcome measure. The data are also not broken down by IMD, there are heaps of missing data in LEO, and those who continue into further study are not captured. Low-IMD students may or may not be earning the same sort of salaries as their more advantaged peers. And the regional weightings seem insufficient in light of the dominance of high-salary regions in both the US and English SMIs. These shortcomings make the measure a highly problematic one to use, though the authors are right to endeavour to capture some outcome of higher education.

We would like a bolder SMI. Social mobility is not only about income but about opportunities and choice, and about enabling meaningful contribution to society. This was recognised in Bowen and Bok’s (2000) evaluation of affirmative action, which measured ‘impact’ not only as income but as civic contribution, health and well-being. Armstrong and Hamilton (2015) show the importance of friendship and marriage formation as a result of shared higher education experiences. The global pandemic has shown that the most useful jobs we rely on, such as early years educators, are disgracefully underpaid. The present SMI’s reduction of ‘success’ to a poor measure of economic outcomes needs redressing in light of how far the academic debate has advanced.

Also, social mobility is about more than class; it is about equal opportunities for first-generation students, disabled students, men and women, refugees, asylum seekers and global majority ethnic groups, as well as local, regional, national and international contributions. It is also about thinking not only about undergraduate student access, progress and success but also about postgraduates, staff and the research and teaching at universities.

A really surprising absence in the introduction of this new SMI is reference to the Times Higher Education Impact Rankings. These are the only global performance tables that assess universities against the United Nations’ Sustainable Development Goals. First published in 2019, this ranking includes a domain on reducing inequality. The metrics used by the Times Higher ranking are: Research on reducing inequalities (27%); First-generation students (23.1%); Students from developing countries (15.4%); Students and staff with disabilities (11.4%); and Institutional measures against discrimination – including outreach and admission of disadvantaged groups (23.1%). The THE ranking celebrates that institutions also contribute to social mobility through what they research and teach. This dimension should be borrowed for an English SMI in light of the importance attached to research-led, research-informed and research-evidenced practices in the higher education sector.

The use of individual measures in the THE ranking – of those whose parents have no background of higher education (first-generation students) and those with disabilities, including staff – has merit. Yes, individual-level measures are often challenging to ‘operationalise’. But this shouldn’t prevent us from arguing that they are the right measures to aspire to using. However, the use of first-generation students also highlights that the debate in the UK, which focuses on area-level disadvantage measures such as IMD or POLAR, differs from the international framing of first-generation students, which measures the educational background of students themselves.

The inclusion of staff in the THE ranking is an interesting domain that merits consideration. For example, data on the gender pay gap are easily obtainable in England and would indicate something about organisational culture. Athena Swan awards, the Race Equality Charter or other similar awards, which are an indicator of diversity and equality in an institution, could be considered as organisational commitments to the agenda and are straightforward to operationalise.

We warmly welcome the SMI and congratulate Professor David Phoenix on putting the debate centre-stage, and note that his briefing has already stimulated responses such as Professor Peter Scott’s thoughtful contribution. It is important to think about social mobility and added value as part of the league table world. It is in the nature of league tables that they oversimplify the work done by universities and staff and the achievements of their students.

There is real potential in the idea of an SMI, and we hope that our contribution will bring some of these dimensions into the public debate about how we construct the index. This would create an SMI that celebrates good practice by institutions in the space of social mobility and encourages more good practice that will ultimately make higher education more inclusive and diverse while supporting success for all.

SRHE member Anna Mountford-Zimdars is Professor of Social Mobility and Academic Director of the Centre for Social Mobility at the University of Exeter. Pallavi Amitava Banerjee is a Fellow of the Higher Education Academy. She is an SRHE member and Senior Lecturer in Education in the Graduate School of Education at the University of Exeter.



1 Comment

Some different lessons to learn from the 2020 exams fiasco

by Rob Cuthbert

The problems with the algorithm used for school examinations in 2020 have been exhaustively analysed, before, during and after the event. The Royal Statistical Society (RSS) called for a review, after its warnings and offers of help in 2020 had been ignored or dismissed. Now the Office for Statistics Regulation (OSR) has produced a detailed review of the problems, Learning lessons from the approach to developing models for awarding grades in the UK in 2020. But the OSR report only tells part of the story; there are larger lessons to learn.

The OSR report properly addresses its limited terms of reference in a diplomatic and restrained way. It is far from an absolution – even in its own terms it is at times politely damning – but in any case it is not a comprehensive review of the lessons which should be learned; it is a review of the lessons for statisticians to learn about how other people use statistics. Statistical models are tools, not substitutes for competent management, administration and governance. The report makes many valid points about how the statistical tools were used, and how their use could have been improved, but the key issue is one of meta-perspective: no-one was addressing the big picture sufficiently. An obsession with consistency of ‘standards’ obscured the need to consider the wider human and political implications of the approach. In particular, it is bewildering that no-one in the hierarchy of control was paying sufficient attention to two key differences. First, national ‘standardisation’ or moderation had been replaced by a system which pitted individual students against their classmates, subject by subject and school by school. Second, 2020 students were condemned to live within the bounds not of the nation’s, but their school’s, historical achievements. The problem was not statistical, nor anything to do with the algorithm; the problem was with the way the problem itself had been framed – as many commentators pointed out from an early stage. The OSR report (at 3.4.1.1) said:

“In our view there was strong collaboration between the qualification regulators and ministers at the start of the process. It is less clear to us whether there was sufficient engagement with the policy officials to ensure that they fully understood the limitations, impacts, risks and potential unintended consequences of the use of the models prior to results being published. In addition, we believe that, the qualification regulators could have made greater use of  opportunities for independent challenge to the overall approach to ensure it met the need and this may have helped secure public confidence.”

To put it another way: the initial announcement by the Secretary of State was reasonable and welcome. But when Ofqual proposed that ranking students and tying each school’s results to its past record was the only way to do what the SoS wanted, no-one in authority was willing either to change the approach or to make its implications transparent enough for the public to lose confidence at the start, in time for government and Ofqual to change course.

The OSR report repeatedly emphasises that the key problem was a lack of public confidence, concluding that:

“… the fact that the differing approaches led to the same overall outcome in the four countries implies to us that there were inherent challenges in the task; and these challenges meant that it would have been very difficult to deliver exam grades in a way that commanded complete public confidence in the summer of 2020 …”

“Very difficult”, but, as Select Committee chair Robert Halfon said in November 2020, things could have been much better:

“the “fallout and unfairness” from the cancellation of exams will “have an ongoing impact on the lives of thousands of families”. … But such harm could have been avoided had Ofqual not buried its head in the sand and ignored repeated warnings, including from our Committee, about the flaws in the system for awarding grades.”

As the 2021 assessment cycle comes closer, attention has shifted to this year’s approach to grading, when once again exams will not feature except as a partial and optional extra. When the interim Head of Ofqual, Dame Glenys Stacey, appeared before the Education Select Committee, Schools Week drew some lessons which remain pertinent, but there is more to say. An analysis of 2021 by George Constantinides, a professor of digital computation at Imperial College whose 2020 observations were forensically accurate, has been widely circulated and equally widely endorsed. He concluded in his 26 February 2021 blog that:

“the initial proposals were complex and ill-defined … The announcements this week from the Secretary of State and Ofqual have not helped allay my fears. … Overall, I am concerned that the proposed process is complex and ill-defined. There is scope to produce considerable workload for the education sector while still delivering a lack of comparability between centres/schools.”

The DfE statement on 25 February kicks most of the trickiest problems down the road, and into the hands of examination boards, schools and teachers:

“Exam boards will publish requirements for schools’ and colleges’ quality assurance processes. … The head teacher or principal will submit a declaration to the exam board confirming they have met the requirements for quality assurance. … exam boards will decide whether the grades determined by the centre following quality assurance are a reasonable exercise of academic judgement of the students’ demonstrated performance. …”

Remember in this context that Ofqual acknowledges “it is possible for two examiners to give different but appropriate marks to the same answer”. Independent analyst Dennis Sherwood and others have argued for alternative approaches which would be more reliable, but there is no sign of change.

Two scenarios suggest themselves. In one, where this year’s results are indeed pegged to the history of previous years, school by school, we face the prospect of overwhelming numbers of student appeals, almost all of which will fail, leading no doubt to another failure of public confidence in the system. The OSR report (3.4.2.3) notes that:

“Ofqual told us that allowing appeals on the basis of the standardisation model would have been inconsistent with government policy which directed them to “develop such an appeal process, focused on whether the process used the right data and was correctly applied”.”

Government policy for 2021 seems not to be significantly different:

“Exam boards will not re-mark the student’s evidence or give an alternative grade. Grades would only be changed by the board if they are not satisfied with the outcome of an investigation or malpractice is found. … If the exam board finds the grade is not reasonable, they will determine the alternative grade and inform the centre. … Appeals are not likely to lead to adjustments in grades where the original grade is a reasonable exercise of academic judgement supported by the evidence. Grades can go up or down as the result of an appeal.” (emphasis added)

There is one crucial exception: in 2021 every individual student can appeal. Government no doubt hopes that this year the blame will all be heaped on teachers, schools and exam boards.

The second scenario seems more likely and is already widely expected, with grade inflation outstripping the 2020 outcome. There will be a check, says DfE, “if a school or college’s results are out of line with expectations based on past performance”, but it seems doubtful whether that will be enough to hold the line. The 2021 approach was only published long after schools had supplied predicted A-level grades to UCAS for university admission. Until now there has been a stable relationship between predicted grades and examination outcomes, as Mark Corver and others have shown. Predictions exceed actual grades awarded by consistent margins; this year it will be tempting for schools simply to replicate their predictions in the grades they award. Indeed, it might be difficult for schools not to do so, without leaving their assessments subject to appeal. In the circumstances, the comments of interim Ofqual chief Simon Lebus that he does not expect “huge amounts” of grade inflation seem optimistic. But it might be prejudicial to call this ‘grade inflation’, with its pejorative overtones. Perhaps it would be better to regard predicted grades as indicators of what each student could be expected to achieve at something close to their best – which is in effect what UCAS asks for – rather than when participating in a flawed exam process. Universities are taking a pragmatic view of possible intake numbers for 2021 entry, with Cambridge having already introduced a clause seeking to deny some qualified applicants entry in 2021 if demand exceeds the number of places available.

The OSR report says that Ofqual and the DfE:

“… should have placed greater weight on explaining the limitations of the approach. … In our view, the qualification regulators had due regard for the level of quality that would be required. However, the public acceptability of large changes from centre assessed grades was not tested, and there were no quality criteria around the scale of these changes being different in different groups.” (3.3.3.1)

The lesson needs to be applied this year, but there is more to say. It is surprising that there was apparently such widespread lack of knowledge among teachers about the grading method in 2020 when there is a strong professional obligation to pay attention to assessment methods and how they work in practice. Warnings were sounded, but these rarely broke through to dominate teachers’ understanding, despite the best efforts of education journalists such as Laura McInerney, and teachers were deliberately excluded from discussions about the development of the algorithm-based method. The OSR report (3.4.2.2) said:

“… there were clear constraints in the grade awarding scenario around involvement of service delivery staff in quality assurance, or making the decisions based on results from a model. … However, we consider that involvement of staff from centres may have improved public confidence in the outputs.”

There were of course dire warnings in 2020 to parents, teachers and schools about the perils of even discussing the method, which undoubtedly inhibited debate, but even before then exam processes were not well understood:

“… notwithstanding the very extensive work to raise awareness, there is general limited understanding amongst students and parents about the sources of variability in examination grades in a normal year and the processes used to reduce them.” (3.2.2.2)

My HEPI blog just before A-level results day was aimed at students and parents, but it was read by many thousands of teachers, and anecdotal evidence from the many comments I received suggests it was seen by many teachers as a significant reinterpretation of the process they had been working on. One teacher said to Huy Duong, who had become a prominent commentator on the 2020 process: “I didn’t believe the stuff you were sending us, I thought it [the algorithm] was going to work”.

Nevertheless the mechanics of the algorithm were well understood by many school leaders. FFT Education Datalab was analysing likely outcomes as early as June 2020, and reported that many hundreds of schools had engaged them to assess their provisional grade submissions, some returning with a revised set of proposed grades for further analysis. Schools were seduced, or reduced, to trying to game the system, feeling they could not change the terrifying and ultimately ridiculous prospect of putting all their many large cohorts of students in strict rank order, subject by subject. Ofqual were victims of groupthink; too many people who should have known better simply let the fiasco unfold. Politicians and Ofqual were obsessed with preventing grade inflation, but – as was widely argued, long in advance –  public confidence depended on broader concerns about the integrity and fairness of the outcomes.

In 2021 we run the same risk of loss of public confidence. If that transpires, the government is positioned to blame teacher assessments and probably reinforce a return to examinations in their previous form, despite their known shortcomings. The consequences of two anomalous years of grading in 2020 and 2021 are still to unfold, but there is an opportunity, if not an obligation, for teachers and schools to develop an alternative narrative.

At GCSE level, schools and colleges might learn from emergency adjustments to their post-16 decisions that there could be better ways to decide on progression beyond GCSE. For A-level/BTEC/IB decisions, schools should no longer be forced to apologise for ‘overpredicting’ A-level grades, which might even become a fairer and more reliable guide to true potential for all students. Research evidence suggests that “Bright students from poorer backgrounds are more likely than their wealthier peers to be given predicted A-level grades lower than they actually achieve”. Such disadvantage might diminish or disappear if teacher assessments became the dominant public element of grading; at present too many students suffer the sometimes capricious outcomes of final examinations.

Teachers’ A-level predictions are already themselves moderated and signed off by school and college heads, in ways which must to some extent resemble the 2021 grading arrangements. There will be at least a two-year discontinuity in qualification levels, so universities might also learn new ways of dealing with what might become a permanently enhanced set of differently qualified applicants. In the longer term HE entrants might come to have different abilities and needs, because of their different formation at school. Less emphasis on preparation for examinations might even allow more scope for broader learning.

A different narrative could start with an alternative account of this year’s grades – not ‘standards are slipping’ or ‘this is a lost generation’, but ‘grades can now truly reflect the potential of our students, without the vagaries of flawed public examinations’. That might amount to a permanent reset of our expectations, and the expectations of our students. Not all countries rely on final examinations to assess eligibility to progress to the next stage of education or employment. By not wasting the current crisis we might even be able to develop a more socially just alternative which overcomes some of our besetting problems of socioeconomic and racial disadvantage.

Rob Cuthbert is an independent academic consultant, editor of SRHE News and Blog and emeritus professor of higher education management. He is a Fellow of the Academy of Social Sciences and of SRHE. His previous roles include deputy vice-chancellor at the University of the West of England, editor of Higher Education Review, Chair of the Society for Research into Higher Education, and government policy adviser and consultant in the UK/Europe, North America, Africa, and China.


Leave a comment

Let them eat data: education, widening participation and the digital divide

by Alex Blower and Nik Marsdin

The quest for an answer

As an education sector we like answers, answers for everything, right or wrong. Sometimes we’re more concerned with arriving at an answer than we are with ensuring it tackles the issue addressed by the question.

Widening HE participation is led by policy that dictates which answers we provide to what questions and to whom. All too often this leads to practitioners scrambling for answers to questions which are ill-suited to the issue at hand, or looking for a quick solution in such haste that we forget to read the question properly.

The COVID-19 pandemic has once again laid bare the stark inequality faced by children and young people in our education system. With it has come an influx of new questions from policy makers, and answers from across the political and educational spectrum.

A magic ‘thing’

More often than not, answers to these questions will comprise a ‘thing’. Governments like tangible things: mentoring, tutoring, longer days, boot camps and shiny new academies. All of these align with the good old fashioned ‘fake it till you make it’ meritocratic ideal. For the last 40 years the Government has shied away from recognising, let alone addressing, embedded structural inequality from birth. It’s difficult, it’s complicated, and it can’t readily be answered in a tweet or a soundbite from a 6pm press conference.

The undesirable implications of a search for an ‘oven ready’ answer can be seen in the digital divide. A stark example of what access to the internet means for the haves and have-nots of the technological age.

‘So, the reason young people are experiencing extreme inequality and not becoming educationally successful, is because they don’t have enough access to technological things?’

‘What we need is a nice solid technological thing we can pin our hopes on…’

‘Laptops for everyone!’

Well, (and I suspect some voices in the back know what’s coming) access to technology alone isn’t the answer, in the same way that a pencil isn’t the answer to teaching a child to write.

Technology is a thing, a conduit, a piece of equipment that, if used right, can facilitate a learning gain. As professionals working to widen HE participation, we need to challenge these ‘oven ready answers’, especially if they seem misguided or, dare I say it, woefully ignorant of the challenges working-class communities face.

After distribution of the devices, online engagement didn’t change

Lancaster University developed the ‘Connecting Kids’ project during the first wave of COVID-19, as a direct response to calls for help from local secondary schools. The project achieved what it set out to do: it procured over 500 brand new laptops or Chromebooks, and free internet access for all recipients. Every child without a suitable device in the home who fell outside of the Department for Education scheme would now have one. Problem solved, right?

Not quite. Prior to the DfE scheme and the Connecting Kids initiative, engagement in online learning environments in years 8 and 9 was hovering at about 30% of students engaging daily and 45% weekly. After the distribution of devices, engagement remained at almost exactly the same level. Further inspection of the data from the telecoms provider showed that of the 500 mobile connections distributed, only 123 had been activated. Of those 123, only half were being regularly used. Of the 377 ‘unused’ sim and mi-fi packages, around 200 showed ‘user error’ in connection status.

Again, this may come as no surprise to the seasoned professionals working with children and young people at the sharp end of structural inequality, but it turned out the ‘thing’ wasn’t the answer. Who would have thought it?

Understanding communities and providing resources

Fast forward six months, and monthly interviews with participating school staff (part of the project evaluation, not yet complete) show that online engagement in one school is up to 92%. The laptops have played a valuable role in that: they have enabled access. What they haven’t done, however, is understand and make allowances for the circumstances of children, young people and families. That has taken a commitment by the schools to provide holistic wrap-around services in partnership with other organisations. It has included short courses on connecting to the internet, and provision of basic learning equipment such as pencils, paper and pens. It has included the school day and timetable being replicated online, live feedback sessions with teachers and learning assistants, and drop-in sessions for parents and carers. Most importantly, it has included a recognition of the difference between home and school, and the impact it has on the education of working-class young people.

Back to policy and widening participation. If we are to make our work truly meaningful for young people, we must critically engage with a policy narrative which is built around a desire for quick fixes, soundbites and ‘oven ready things’. We owe it to the young people who are being hit hardest by this pandemic to take a step back and look at the wider barriers they face.

To do this we may need to reconceptualise what it means to support them into higher education. This starts with challenging much of the policy designed to improve access to higher education that is built upon a premise of individual deficit. The repetitive waving of magical policy wands to conjure up laptops, mentors and days out on campus will only serve to leave us with ever increasing numbers of students and families who are left out and disengaged. Numbers that will continue to rise unless we take the time to engage critically with the complex, numerous and damaging inequalities that working-class young people face.

Reshaping university outreach

This leaves us with something of a conundrum. As HE professionals, what on earth can we do about all of that? Is it our place to address an issue so vast, and so intimately tied to the turning cogs of government policy and societal inequality?

Well, if recent conversations pertaining to higher education’s civic purpose are anything to go by, the answer is undoubtedly yes. And we need to do it better. Within our mad scramble to do something to support young learners during the first, second, and now third national lockdown, our ‘thing’ has become online workshops.

For many of us the ramifications of the digital divide have been acknowledged, but we have shied away from them in work to widen HE participation. We’ve kept doing what we’ve always done, but switched to a model of online delivery which restricts who has the ability to access the content. Can we honestly say, given the disparity in digital participation amongst the most and least affluent groups, that this is the right answer to the question?

Rather than an online workshop series on ‘choosing universities’, would our time and resource be better spent organising student ambassadors from computing subjects to staff a freephone helpline supporting young people in the community to get online? Could we distribute workbooks with local newspapers? Could we, as they did at Lancaster, work in partnership with other local and national organisations to offer more holistic support, support which ensures that as many students as possible are able to participate in education digitally?

For us, the answer is yes. Yes we should. And we can start by meaningfully engaging with the communities our universities serve. By taking the time to properly listen and understand the questions before working with those communities to provide an answer.

Currently based at the University of Portsmouth, Dr Alex Blower has worked as a professional in widening access to higher education for the last decade. He completed his doctoral research on education and inequality last year, and his research interests centre on class, masculinity and higher education participation. Follow Alex via @EduDetective on Twitter.

Nik Marsdin is currently lead for the Morecambe Bay Curriculum (part of the Eden North Project) at Lancaster University. Nik worked in children’s social care, youth justice and community provision for 12 years prior to moving into HE. His research interests are widening participation, school exclusion, transitions in education and alternative provision. Follow Nik via @MarsdinNik on Twitter.