The Society for Research into Higher Education

REF outcomes: Some comments and analysis on the Education Unit of Assessment

By Ian McNay

A recent report to the Leadership Foundation for HE (Franco-Santos et al, 2014) showed that ‘agency’ approaches to performance management favoured by managers, which ‘focus on short-term results or outputs through greater monitoring and control’, are ‘associated with … lower levels of institutional research excellence’. A small project I conducted last summer on the REF experience showed such approaches to be pervasive in many of the institutions in the sample. Nevertheless, the main feature of the 2014 REF results was grade inflation relative to 2008.

Perhaps such outstanding global successes, setting new paradigms, are under-reported, or are in fields beyond the social sciences where I do most of my monitoring. More likely, the ‘strategic entry policies’ and ‘selectivity in many submissions’ noted by the panel have shifted the profile from 2008. A second successive reduction of 15 per cent in the staff numbers submitted, with 15 institutions opting out and only nine coming in since 2008, will also have had an effect. Welcome back to Cardiff, hidden in a broad social sciences submission last time. The gain from Cardiff’s decision is that it will scoop all of the QR funding for education research in Wales. None of the six other institutions making submissions in 2008, with some 40 staff entered, appeared in the 2014 results. That may be a result of the disruptions from mergers, or a set of strategic decisions, given the poor profiles last time.

What has emerged, almost incidentally, is an extended statement of the objectives/ends of the exercise, which now does not just ‘inform funding’. It provides accountability for public investment in research and evidence of the benefit of that investment. The outcomes provide benchmarking information and establish reputational yardsticks for use within the HE sector and for public information. There is still nothing about improving the quality of research, so any such consequence is not built into the design of the exercise, but comes as a collateral benefit.

My analysis here is only of the Education Unit of Assessment [UoA 25]. More can be found at www.ref.ac.uk.

Something that stands out immediately is the effect of the environment and impact scores on the overall UoA profile. Across all submissions, the percentage score for 4* ratings was 21.7 for outputs, but 42.9 for impact and 48.4 for environment. At the extreme, both the Open University and Edinburgh more than doubled their 4* score between the output profile and the overall profile. Durham, Glasgow and Ulster came close to doing so, and Cardiff and the Institute of Education added 16 and 18 percentage points respectively to their 4* overall profile.
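
For readers unfamiliar with the mechanics, the overall profile is a weighted average of the three sub-profiles; in REF 2014 the published element weightings were 65 per cent for outputs, 20 per cent for impact and 15 per cent for environment. The minimal sketch below uses invented figures, not any institution’s actual scores, simply to show how strong impact and environment ratings can more than double a modest 4* output score:

```python
# Minimal sketch of how a REF 2014 overall quality profile combines the three
# sub-profiles. The weights (outputs 65%, impact 20%, environment 15%) are the
# published REF 2014 element weightings; the 4* percentages below are invented
# purely for illustration and are not any institution's actual results.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def overall_4star(scores: dict) -> float:
    """Weighted average of the 4* percentages across the three elements."""
    return sum(WEIGHTS[element] * pct for element, pct in scores.items())

# A unit with a modest output profile but very strong impact and environment:
example = {"outputs": 20.0, "impact": 80.0, "environment": 100.0}
print(round(overall_4star(example), 1))  # 44.0, more than double the 4* output score
```

On those illustrative figures, an output profile of only 20 per cent at 4* becomes an overall 4* score of 44 per cent once the impact and environment elements are folded in.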

Seven of the traditional universities, plus the OU, had 100 per cent ratings for environment. All submitted more than 20 FTE staff for assessment – the link between size and a ‘well found’ department is obvious. Several ‘research intensive’ institutions that might be expected to be in that group are not there: Cambridge scored 87.5 per cent. Seven units were judged to have no 4* elements in their environment. Durham, Nottingham and Sheffield had a 100 per cent score for impact in their profile, making Nottingham, which also scored 100 per cent for environment, the only place with a ‘double top’.

In England, 21 traditional universities raised their overall score; six did not. Among the less research-intensive universities, 29 dropped between the two profiles, but there were also seven gains in 4* ratings: Edge Hill went from 2.1 per cent [output] to 9 per cent [overall] because of a strong impact profile; Sunderland proved King Lear wrong in his claim that ‘nothing will come of nothing’, going from zero [output rating] to 5 per cent overall, also because of its impact score. Among the higher-scoring modern universities, Roehampton went down from 31 to 20 at 4*, but still led the modern universities with 71 per cent overall in the 3*/4* categories.

I assume there was monitoring of inter-assessor comparability, but there appear to have been assessors who used a 1-10 scale and added a zero, and others who used simple fractions, for both impact and environment; many of those profiles do not get beyond a 50/50 split. For outputs it is different: even with numbers submitted in low single figures, most scores go to the single decimal point allowed.

For me, one of the saddest contrasts was thrown up by alphabetical order. Staffordshire had an output rating of 25 per cent at 4*; Southampton had 18 per cent. When it came to the overall profile, the rankings were reversed: Staffordshire went down to 16 because it had no scores at 3* or 4* in either impact or environment; Southampton went up to 31 per cent. There is a personal element here: in 2001 I was on the continuing education subpanel. In those days there was a single grade; Staffordshire had good outputs – its excellent work on access was burgeoning – but was held down a grade because of concerns about context issues: it was new and not in a traditional department. Some traditional departments were treated more gently because they had a strong history.

The contribution of such elements was not then quantified, nor openly reported, so today’s openness at least allows us to see the effect. I believed, and still do, that, like the approach to A-level grades, contextual factors should be taken account of, but in the opposite direction: doing well despite a less advantaged context should be rewarded more. My concern about the current structure of REF grading is that, as in 2001, it does the exact opposite. The message for units in modern universities, whose main focus is on teaching, is to look to the impact factor. A research agenda built round impact from the project inception stage may be the best investment for those who wish to continue to participate in future REFs, if any. There is a challenge, because much of their research may be about teaching, but the rules of the game bar impact on teaching from inclusion in any claim for impact. That dismisses any link between the two, the Humboldtian concept of harmony, and any claims to research-led teaching. As a contrast, on the two criticisms outlined here, the New Zealand Performance-Based Research Fund [PBRF] has, among its aims:

  • To increase the quality of basic and applied research
  • To support world-leading research-led teaching and learning at degree and postgraduate level.

Might the impending review of REF take this more integrated and developmental approach?

Reference: Franco-Santos, M., Rivera, P. and Bourne, M. (2014) Performance Management in UK Higher Education Institutions, London: Leadership Foundation for Higher Education

SRHE Fellow Ian McNay is emeritus professor at the University of Greenwich.

The student experience in England: changing for the better and the worse?

By Paul Temple

When the present English tuition fee regime was being planned, there were plenty of voices from inside universities warning that it would change the nature of the relationship between students and their universities for the worse. Students would, it was feared, become customers, rather than junior partners in an academic enterprise. Indeed, this was what the Government’s 2011 White Paper, Students at the Heart of the System, seemed to look forward to: “Better informed students will take their custom to the places offering good value for money” (para 2.24) – in other words, they would, it was hoped, act like normal consumers. Has this happened?

Professor Sir David Watson 1949-2015

SRHE President 2005-2012

It is with great sadness that the Society announces the death of Professor Sir David Watson, SRHE President from 2005 to 2012, who passed away on Sunday 8 February after a very short illness.

Throughout his career David sought to develop an understanding of higher education through research. He strongly supported the idea that higher education was a social good. As President of SRHE, David used his annual address to conference (always the best attended keynote) to explore the relationship of higher education and society in ways that urged members of the research community to search for evidence on how current policies and practices were affecting society more generally, as well as students and staff in the sector.

David was a staunch supporter of the work of the Society for Research into Higher Education and the Society’s President from 2005 to 2012, standing down at his own insistence, as was typical of David, otherwise we would never have let him go. He felt that his seven years as President (two more than he signed up for) was, in his own words, “enough from me”.

He was wrong of course, as there could never be enough of what David had to say, the way that he said it and the breadth of knowledge, understanding, weight of compassion and sheer humanity that imbued everything he said and wrote about higher education.

We will not see his like again.

His legacy in terms of his written work will remain with us. What his colleagues will remember most is the friendship and support he gave to everyone, the way he touched so many lives and supported so many careers. He believed absolutely in collegiality within the academy and fostering respect between colleagues at all times. In this regard he led by supreme example.

Professor Yvonne Hillier, Professor of Higher Education at the University of Brighton and Chair of the Society during David’s tenure as President, was also a personal friend and has summed up for us how we felt about him perfectly:

“Sir David was one of the few truly honest men who combined intellectual prowess with genuine concern and friendship for colleagues. He was particularly self-effacing when encouraging newer researchers to engage with examination of higher education. His genuine warmth for colleagues in the research community was much appreciated by newer and fully established researchers alike”.

Universities must act collectively to remedy lower offer rates for ethnic minority applicants

By Vikki Boliver

The Runnymede Trust has just launched its publication Aiming Higher: Race, Inequality and Diversity in the Academy, which shines a spotlight on ethnic inequalities in UK universities. The report brings together 15 short essays written by academics and policy makers which make clear that radical change is needed to address ethnic inequalities in university admissions, student experiences, degree attainment, graduate labour market outcomes, and access to academic positions, especially at senior levels.

In my contribution to the Runnymede publication (see chapter 5) I focus on the issue of ethnic inequalities in university admissions chances. Although British ethnic minorities are more likely to go to university than their White British peers, some ethnic minority groups – notably the Black Caribbean, Black African, Pakistani and Bangladeshi groups – remain strikingly underrepresented in the UK’s most academically selective institutions, including Russell Group universities. Of course this is partly due to ethnic inequalities in secondary school attainment, which mean that members of these groups are less likely to have the high grades required for entry to highly selective universities. But we also know, from analysing university admissions data, that British ethnic minority applicants are less likely to be offered places at highly selective universities even when they have the same grades and ‘facilitating subjects’ at A-level as White British applicants.

Was that a foul REF?

By Rob Cuthbert

The Research Excellence Framework, the UK’s latest version of research quality assessment, reached its conclusion just after the SRHE Research Conference. Publication of the results in mid-December led to exhaustive coverage in all the HE media. 

In the Research Season 2008-2014, the controversy was not so much about who ended up top of the league as whether the English premier league could still claim to be the best in the world.

Big clubs were even more dominant, with the golden triangle pulling away from the rest and filling the top league positions. But controversy raged about the standard of refereeing, with many more players being labelled world class than ever before. Referees’ supremo David Sweeney was quick to claim outstanding success, but sponsors and commentators were more sceptical, as the number of goals per game went up by more than 50%.

During the season transfer fees had reached record heights as galactico research stars were poached by the big clubs before the end of the transfer window. To secure their World University League places the leading clubs were leaving nothing to chance. It was a league of two halves. After positions based on research outcomes had been calculated there was a series of adjustments, based on how many people watched the game (impact), and how big your stadium was (environment). This was enough to ensure no surprises in the final league table, with big clubs exploiting their ground advantage to the full. And of course after the end of the season there is usually a further adjustment to ensure that the big clubs get an even bigger share of the funding available. This process, decreed by the game’s governing body, is known as ‘financial fair play’.

Some players had an outstanding season – astronomers were reported to be ‘over the moon’ at the final results, but not everyone was happy: one zoologist confided that he was ‘sick as a parrot’. The small clubs lacked nothing in effort, especially at Northampton, where they responded superbly to their manager’s call to put in 107%. But not everyone can be a winner, research is a results business and as always when a team underperforms, some clubs will be quick to sack the manager, and many more will sack the players.

Scepticism about the quality of the league lingers among the game’s governing body, suspicious about high scoring, and there is a risk that the money from the Treasury will finally dry up. The game may not have finished yet, but some … some people are running onto the pitch, they think it’s all over. It is for now.

Rob Cuthbert is Emeritus Professor of Higher Education Management, University of the West of England, Joint Managing Partner, Practical Academics rob.cuthbert@btinternet.com, Editor, Higher Education Review www.highereducationreview.com, and Chair, Improving Dispute Resolution Advisory Service www.idras.ac.uk

Sick of Bullet Points

By Ian Kinchin

There is a plague that has infected higher education over the past decade. One that has been so invasive that it has changed the habits of teachers and the expectations of learners in ways that are quite profound, but have gone largely unnoticed. I am talking about the infestation of lectures with bullet points.

It is a hobby-horse of mine, but I am sick of watching bullet points (especially when presenters think it is cool to have them zooming in from the side of the screen, one-by-one), and I regard it as a sickness among our colleagues that needs to be treated.

Students have asked me if university teachers are instructed to read out bullet points during their lectures. I obviously say, ‘no’, to which the students reply, ‘so why do they do it?’. Everyone I speak to tells me that reading points to students in lectures is bad practice, and yet there appears to be a form of pedagogical paralysis that prevents some colleagues from breaking free of this affliction.

There are a number of assumptions that I make in my mind (fairly or unfairly) about presentations that consist of nothing but bullet points. I assume that the presenter lacks the imagination to offer anything other than bullet points (the greasy-spoon mentality of ‘chips with everything’). I assume that the presenter is lazy, and cannot be bothered to present materials in a more engaging way. I assume that perhaps the presenter doesn’t know the content well enough to transform the content into a different format. I presume that the presenter has not been able to construct a coherent schema in his/her mind and so has to work with atomised chunks of content – whilst expecting me to generate some coherence from the presentation.

I even think there is a direct relationship between the extent of bullet point usage in a presentation and the loss of teacher dynamism and audience engagement, the “bullet point effect”:

Figure: The bullet point effect

For these reasons, I have all but given up going to keynote lectures at conferences. By the time I get to the third or fourth slide of bullet points I have switched off. And I am not alone. I make a point now of sitting near the back of the audience during these sessions, partly so that I can make a quick getaway if it gets too boring, but more interestingly so that I can observe the audience and see what they are doing. From the speaker’s perspective, it often looks as if the audience is engaged, busily typing the pearls of wisdom into their laptops and tablets. From the back, what you see is a sea of screens on which the audience are busy responding to their e-mails.

I was sat in the audience of a keynote a few years ago next to a colleague who knew of my hatred of bullet points. Six slides in and we hadn’t seen anything but bullets. Then the presenter announced the next slide as “the triangle of research”. My ears pricked up and I looked at the screen in anticipation of a geometric depiction of said triangle. What did we get? Three bullet points. To which I said to my colleague, “where’s the triangle?”. His response, “I suppose we have to presume it is in his head – so I wonder who the slide is for?”.

There are exceptions. There are presentations that have been really good. But there seems to be a correlation (at least in my mind) between the quality of the PowerPoint and the quality of the presentation. Some excellent presentations have used PowerPoint, but to show things that cannot be adequately summarised in bullets: graphs, maps and photos. I have even been to presentations where PowerPoint has not been used at all. Imagine! Yes, it can be done. One of the best presentations I have been to recently used a single slide – an image that was the focus of the lecture. Also, colleagues need to remember that PowerPoint can be turned off for part of a presentation – just press ‘B’ on the keyboard and the screen will go black.

I was recently unable to attend a presentation in London – one that sounded potentially very interesting and was to be led by the great and the good. Although in the end I couldn’t go, I was pleased that the organisers offered to send me copies of the presentations from the day. When the files arrived in my e-mail, I was eager to see what I had missed. Unfortunately, all I got was a lot of bullet points. From these I was unable to determine whether there was any coherence or innovation in the ideas that had been presented. It was like seeing only the chapter headings from a book. Useless!

One year, a group of us did hatch a plan to produce T-shirts for a conference bearing the slogan:

“STOP USING BULLET POINTS – THEY RESULT IN NON-LEARNING”.

Perhaps next year, unless a cure is found in the meantime.

Reference

Kinchin, I.M., Lygo-Baker, S. and Hay, D.B. (2008) Universities as centres of non-learning. Studies in Higher Education, 33(1): 89-103

Professor Ian Kinchin is Head of the Department of Higher Education at the University of Surrey, and is also a member of the SRHE Governing Council. This post was first published on Ian’s personal blog, https://profkinchinblog.wordpress.com and is reproduced here with the author’s permission.

International research on the student experience: Power, methodologies and translation

By Camille Kandiko Howson

The SRHE Student Experience Network hosted a discussion at the 2014 SRHE Conference exploring international dimensions of student experience research. We featured three international speakers to provide examples of topics and challenges faced when conducting international and comparative research:

  • Madeleine Kapinga Mutatayi, from Kinshasa, DR Congo, PhD student in the Department of Educational Sciences, Center for Instructional Psychology and Technology, KU Leuven, Belgium
  • Dr Johanna Annala, Senior Lecturer, School of Education, University of Tampere, Finland
  • Dr Rebecca Schendel, Lecturer, Institute of Education, UCL, UK

The event was chaired by Student Experience Network co-convenor Camille Kandiko Howson with co-convenor Matthew Cheeseman in spiritu.

The Student Experience is growing in importance around the world (Barber et al 2013), whilst at the same time decreasing in common understanding, shared definitions and research coherence. This may be due to the variety of foci of research into the student experience, including:

  • Curricular (learning gains, assessments, breadth and depth) (Douglass et al 2012; Crosling et al 2008)
  • Co-curricular (additional opportunities, such as community engagement, study abroad, and industry collaboration and employability) (Mourshed et al 2012)
  • Extra-curricular (accommodation, lifestyle, sports, societies, politics) (Thomas 2012; UNITE 2014)

These foci are further compounded by levels of analysis, including individual, group (such as minority groups and international students), institutional (on topics such as governance, engagement and satisfaction), and inter/national (such as access, progression, labour market and rankings). Following the paradox of globalisation, and as countries around the world position higher education in society (such as dropping tuition fees in Germany and dramatically increasing them in the UK), what key issues about the student experience are of relevance across higher education research, beyond national politics and policies?

Two questions were used to stimulate topics for discussion:

  •  What research questions are not being asked about the student experience?
  • What research and evidence could promote productive, effective educational models of higher education?

Across national contexts, the importance of measuring and researching learning emerged strongly. Several delegates raised troublesome issues arising when Western views of educational research and practice are used in other contexts. This encompassed staff and student perceptions of the learning environment and their relationship to each other, in terms of culturally-bound classroom and research practices. It highlighted power issues between staff and students, particularly the notion of students publicly challenging or critiquing staff, even in confidential research settings. A fundamental question arose about different national and cultural interpretations of the concept of student voice, and the need for more comparative work on what student voice means in practice in different political contexts, inside and outside the university setting.

The challenges of borrowing, or imposing, Western views were also discussed in relation to methodologies, particularly the use of large-scale surveys. The US-based National Survey of Student Engagement offers the opportunity for comparative research on student engagement and student learning using a validated tool, but researchers noted that many of its underlying theories are culturally bound, such as ‘ideal’ forms of engagement between staff and students and notions of democratic participation, particularly in non-democratic settings. From East Asia to Africa to the Middle East, different perspectives and modes of interaction were discussed. Related methodological issues included the challenges of translating English-based research resources into other languages and cultural settings.

A useful framework picked up from conference presentations was that of using ‘powerful knowledge’ and ‘powerful understanding’ when crossing borders, using international methods and tools, and partnering with colleagues.

This seminar concluded with a desire to share opportunities for collaboration and participation in exploring these areas of research. This entry is a start to that endeavour, welcoming comments and proposals for projects that could be carried out in multiple national contexts and engage the international higher education research community. So please comment, share and get in touch with one another!

Dr Camille Kandiko Howson is a research fellow at King’s College London and co-editor of The Global Student Experience: An International and Comparative Analysis

References

Barber, M., Donnelly, K. and Rizvi, S. (2013) An Avalanche is Coming: Higher Education and the Revolution Ahead, London: Institute for Public Policy Research

Crosling, G., Thomas, L. and Heaney, M. (2008) Improving Student Retention in Higher Education: The Role of Teaching and Learning, London: Routledge

Douglass, J.A., Thomson, G. and Zhao, C.M. (2012) The learning outcomes race: the value of self-reported gains in large research universities. Higher Education, 64(3): 317-335

Mourshed, M., Farrell, D. and Barton, D. (2012) Education to Employment: Designing a System that Works, Washington, DC: McKinsey Center for Government

Staddon, E. and Standish, P. (2012) Improving the student experience. Journal of Philosophy of Education, 46(4): 631-648

Thomas, L. (2012) Building Student Engagement and Belonging in Higher Education at a Time of Change: a summary of findings and recommendations from the What Works? Student Retention & Success programme, London: Paul Hamlyn Foundation

UNITE (2014) Living and Learning in 2034: A Higher Education Futures Project, University Alliance and UNITE
