The Society for Research into Higher Education

Support for doctoral candidates during the pandemic at the University of Melbourne

by Ai Tam Le

This blog is based on the author's contribution to a special issue of Studies in Higher Education published online in January 2021. The special issue includes a range of commissioned articles from academics worldwide about their experiences of Covid-19 restrictions in 2020. Many of the authors featured in the special issue will be speaking about their contributions at the SRHE Webinar being held on 27 January 2021.

There is little doubt that doing a PhD can be hard; doing a PhD during a pandemic certainly makes it harder. But the diversity of PhD projects, the resources required to undertake them, and the differences in our situations mean that each PhD candidate has faced a rather different set of challenges and experienced varying degrees of disruption due to the pandemic. In my university, the University of Melbourne (henceforth the university), the pandemic has had little impact on some students' progress; for others, however, the lack of access to lab facilities or fieldwork has brought their projects to a halt.

In a recent paper for a special issue of Studies in Higher Education, I took a closer look at the support provided to doctoral candidates at the university and discussed some of the issues that arose. In this blog, I summarise the situation and highlight two major issues with the university's approach to supporting doctoral students.

University’s support and graduate researchers’ Open Letter

When the pandemic was escalating in March 2020, alongside advising students to work from home, the university gradually introduced different support measures in terms of finance (emergency funds), psychological well-being (counselling services) and candidature (extension of candidature and stipend). Yet the following months saw rising unease among graduate researchers (masters and doctoral candidates) at the university, a group of whom drafted and sent an Open Letter to the university outlining their requests for 'real' support. The letter was signed by more than 640 graduate researchers and academics.

What was requested in the letter? Two major requests were put forward: a special category of leave (a period of non-active enrolment) for reasons related to COVID-19; and a six-month universal extension for all students. Six months was requested because, I suppose, there was an expectation at the beginning of the pandemic that it would take at least six months to be back to ‘normal’. (The reality has proven that this expectation was overly optimistic.) Similarly, the Graduate Student Association at the university at first advocated for three-month universal extensions and later ‘the commitment to six-month extensions as standard or more where needed’.

The university did not respond – and has not, to my understanding, officially responded – to the Open Letter; however, its support measures have addressed these concerns to a large extent. Specifically, a new category of leave was created to support students who were not able to continue their research due to the pandemic. Extension of stipend from 3 to 3.5 years was automatically granted to students at a certain stage of their candidature*. Stipend beyond 3.5 years (up to 26 weeks of extension) can also be requested, provided that the student can supply sufficient documentation to support their case.

The ‘hidden’ issues

It is fair to say that, overall, these measures have accommodated the needs of most students. But there are some ‘hidden’ issues as highlighted below.

First, the university's 'business-as-usual' expectation could create undue pressure for doctoral students. While the university's commitment to supporting its students to make progress was well-intentioned, the expectation of 'business as usual' in an unusual time could be misinterpreted as pressure to work, which was not possible for some students. Failing to meet this expectation could be seen by students as a sign of weakness, creating further stress.

Second, the application process for leave or extension was deemed bureaucratic by some students. In the application, a student must demonstrate the 'exceptional circumstance' under which their research has been disrupted. Some students argued that the existence of the pandemic itself constituted an 'exceptional circumstance' that would qualify all students for leave or universal extension. The requirement to demonstrate 'exceptional circumstance' was seen as just another layer of documentation and reporting. Moreover, while some disruptions can be documented and quantified, such as days without access to a lab or field site, others cannot, such as the lack of an appropriate workspace or the psychological stress of isolation and the pandemic threat. For example, it would be challenging to quantify and document the loss of productivity due to sitting on a dining chair instead of an ergonomic chair, or due to distractions caused by homeschooling kids. Some students could fall through the cracks, unable to accumulate 'sufficient' documentation to support their case for leave or extension.

Looking back, these issues were hidden – visible only with inside knowledge of the detail – and seemed trivial compared to the support provided; they can easily be forgotten as time goes by. With the advantage of hindsight, it is now easier to make sense of the contestation between the university's support and the students' demands. On the one hand, constrained by financial resources, mostly due to the loss of revenue from international students, the university had to strike a balance between, among other things, organisational survival and doctoral students' needs. The mechanisms set up – bureaucratic as they could be – were necessary for allocating limited resources to those deemed most in need. On the other hand, research students did not know what situation they were getting into or how and when they would get out of it, so any barrier to getting support – regardless of how small – added to the existing chaos and would seem significant to the students.

Concluding thoughts

Even though the situation has improved in Australia and some parts of the world, the pandemic is not over yet. No one can confidently say when things are going back to 'normal' or what that 'normal' will look like. As many PhD candidates are still unable to go back to campus and go about their daily business, the question of how effective these supports are in the long term remains open.

Moreover, given that research students make a significant contribution to research and development in Australia (ABS 2020), there is a need for further monitoring and reporting on how research students have been affected by the pandemic. This would provide a nuanced understanding of the impact of the pandemic on research workforce capacity in Australia.

*In a recent announcement, the extension has reverted to standard procedures through which students must have approval from their supervisory committee instead of being granted automatically.

Ai Tam Le is a PhD candidate at the Melbourne Centre for the Study of Higher Education and Melbourne Graduate School of Education (University of Melbourne, Australia). Her PhD project explores aspiring academics' understanding of the academic profession in Australia. She is a contributor to the Early Career Researchers in Higher Education Blog (echer.org). She tweets @aitamlp.



Peer Observation of Teaching – does it know what it is?

by Maureen Bell

What does it feel like to have someone observing you perform in your teaching role? Suppose they tick off a checklist of teaching skills and make a judgement as to your capability, a judgement that the promotions committee then considers in its deliberations on your performance? How does it feel to go back to your department and join the peer who has written the judgement? Peer Observation of Teaching (POT) is increasingly being suggested and used as a tool for the evaluation, rather than collaborative development, of teaching practice.

Can POT for professional development co-exist with and complement POT for evaluation? Or are these diametrically opposed philosophies and activities such that something we might call Peer Evaluation of Teaching (PET) has begun to undermine the essence of POT?

I used to think the primary purpose of peer observation of teaching (POT) was the enhancement of teaching and learning. I thought it was a promising process for in-depth teaching development. More recently I have been thinking that POT has been hijacked by university quality assurance programs and re-dedicated to the appraisal of teaching by academic promotions committees. The principles and outcomes of POT for appraisal are, after all, quite opposite to those that were placed at the heart of the original POT philosophy and approach – collegial support, reflective practice and experiential learning.

In 1996 I introduced a POT program into my university’s (then) introduction to teaching course for academic staff. Participants were observed by each other, and myself as subject coordinator, and were required to reflect on feedback and plan further action. It wasn’t long before I realised that I could greatly improve participants’ experience by having them work together, experiencing at different times the roles of both observer and observed. I developed the program such that course participants worked in groups to observe each other teach and to share their observations, feedback and reflections. A significant feature of the program was a staged workshop-style introduction to peer observation which involved modelling, discussion and practice. I termed this collegial activity ‘peer observation partnerships’.

The program design was influenced by my earlier experiences of action research in the school system and by the evaluation work of Webb and McEnerney (1995) indicating the importance of training sessions, materials, and meetings. Blackwell (1996), too, in Higher Education Quarterly, described POT as stimulating reflection on and improvement of teaching. Early results of my program, published in IJAD in 2001, reported POT as promoting the development of skills, knowledge and ideas about teaching, as a vehicle for ongoing change and development, and as a means of building professional relationships and a collegial approach to teaching.

My feeling then was that a collegial POT process would eventually be broadly accepted as a key strategy for teaching development in universities. Surely universities would see POT as a high value, low cost, professional development activity. This motivated me to publish Peer Observation Partnerships in Higher Education through the Higher Education Research and Development Society of Australasia (HERDSA).

Gosling's model, which appeared in 2002, proposed three categories of POT, in summary: evaluation, development, and fostering collaboration. Until then I had not considered the possibility that POT could be employed as an evaluation tool, mainly because to my mind observers did not need a particular level of teaching expertise. Early career teachers were capable of astute observation, and of discussing the proposed learning outcomes for the class along with the activity observed. I saw evaluation as requiring appropriate expertise to assess teaching quality against a set of reliable and valid criteria. Having been observed by an Inspector of Schools in my career as a secondary school teacher, I had learned from experience the difference between 'expert observation' and 'peer observation'.

Looking back, I discovered that the tension between POT as a development activity and POT as an evaluation tool had always existed. POT had been mooted as a form of peer review and as a staff appraisal procedure in Australia since the late eighties and early nineties, when universities were experiencing pressure to introduce procedures for annual staff appraisal. The emphasis at that time was evaluative – a performance management approach seeking efficiency and linking appraisal to external rewards and sanctions. Various researchers and commentators c1988-1993, including Lonsdale, Abbott, and Cannon, sought an alternative approach which emphasised collegial professional development. At that time action research involving POT was prevalent in the school system using the Action Research Planner of Kemmis and McTaggart. Around this time Jarzabkowski and Bone from The University of Sydney developed a detailed guide for Peer Appraisal of Teaching. They defined the term 'peer appraisal' as a method of evaluation that could both provide feedback on teaching for personal development and provide information for institutional or personnel purposes. 'Observer expertise in the field of teaching and learning' was a requirement.

In American universities various peer-review-through-observation projects had emerged in the early nineties. A scholarly discussion of peer review of teaching was taking place under the auspices of the American Association for Higher Education Peer Review of Teaching project, and the national conference 'Making Learning Visible: Peer-review and the Scholarship of Teaching' (2000) brought together over 200 participants. The work of Centra and Hutchings in the 1990s, and of Bernstein and others in the 2000s, advocated the use of peer review for teaching evaluation.

In 2002 I was commissioned by what was then the Generic Centre (UK) to report on POT in Australian universities. At that time several universities provided guidelines or checklists for voluntary peer observation, while a number of Australian universities were accepting peer review reports of teaching observations for promotion and appointment. Soon after that I worked on a government funded Peer Review of Teaching project led by the University of Melbourne, again reviewing POT in Australian universities. One of the conclusions of the report was that POT was not a common professional activity. Many universities however listed peer review of teaching as a possible source of evidence for inclusion in staff appraisal and confirmation and promotion applications.

My last serious foray into POT was an intensive departmental program developed with Paul Cooper, then Head of one of our schools in the Engineering Faculty. Along with my earlier work, the outcomes of this program, published in IJAD (2013), confirmed my view that a carefully designed and implemented collegial program could overcome problems such as those reported back in 1998 by Martin in Innovations in Education and Teaching International, 35(2). Meanwhile my own head of department asked me to design a POT program that would provide ‘formal’ peer observation reports to the promotions and tenure committee. I acquiesced, although I was concerned that once POT became formalised for evaluation purposes in this way, the developmental program would be undermined.

Around 2008 my university implemented the formal POT strategy with trained, accredited peer observers and reporting templates. POT is now accepted in the mix of evidence for promotions and is compulsory for tenure applications. In the past year I’ve been involved in a project to review existing peer observation of teaching activities across the institution, which has found little evidence of the use of developmental POT.

The Lonsdale report (see above) proposed a set of principles for peer review of teaching and for the type of evidence that should be used in decisions about promotion and tenure: fairness, such that decisions are objective; openness, such that criteria and process are explicit and transparent; and consistency between standards and criteria applied in different parts of the institution and from year to year. It always seemed to me that the question of criteria and standards would prove both difficult and contentious. How does a promotions committee decipher or interpret a POT report? What about validity and reliability? What if the POT reports don't align with student evaluation data? And what does it mean for the dynamics of promotion when one of your peers' observations might influence your appraisal?

In 2010 Chamberlain et al reported on a study exploring the relationship between annual peer appraisal of teaching practice and professional development. This quote from a participant in the study stays with me, “… the main weakness as far as I’m concerned is that it doesn’t know what it is. Well, what is its purpose?”

POT for professional development is an activity that is collegial, subjective, and reflective. My view is that POT for professional development can only co-exist with a version of POT for evaluation that is re-named, re-framed and standardised. And let’s call it what it really is – Peer Evaluation of Teaching (PET).

Dr Maureen Bell is Editor of HERDSA NEWS, Higher Education Research and Development Society of Australasia; HERDSA Fellow; Senior Fellow, University of Wollongong Australia.


Making admissions better in Australia

By Marcia Devlin

In Australia, the federal government has been focused on improving the transparency of higher education admissions. I have been concerned about, and written on, this matter for some years, particularly the confusion among prospective students and their families caused by exclusive admissions criteria being used as a proxy for quality.

The government-appointed Higher Education Standards Panel (HESP) was asked to consider and report on how the admissions policies and processes of higher education providers could be made clearer, easier to access and more useful, to inform the choices and decisions of prospective students and their families.

In the context of the increased variety of pathways through which a prospective student can apply or be accepted into higher education in Australia, the HESP found that prospective students, their families and others, including schools, are finding it increasingly difficult to understand the full range of study options and opportunities available, and how best to take advantage of these options to meet their education and career objectives.

The HESP made 14 recommendations.


Australian HE reform could leave students worse off

By Marcia Devlin

Australia is in full election campaign mode. What a returned conservative government means for higher education is a little worrying, although what a change of government means is worrying for different reasons.

Two years ago, the then federal Minister for Education, Christopher Pyne, proposed a radical set of changes for higher education funding including, among other things, a 20% cut to funding and full fee deregulation. While the latter received support from some institutions and Vice-Chancellors, there were very few supporters of the whole package. Among those who did not support it were the ‘cross-benchers’ – the independent and minor party members of the Parliament of Australia who have held the balance of power since elected in 2014 – and so the proposals were not passed.

The government has since introduced Senate voting reforms, which mean the minor parties will not be able to swap preferences to secure Senate seats as they have done in the past, and a future cross bench like this one is less likely. That is a shame for higher education, in my view, as these folk actually listened to the sector and the public and responded accordingly. Mr Pyne has now moved on to other responsibilities. But just before he moved, this actually happened: https://www.youtube.com/watch?v=Hc9NRwp6fiI

The new and current Education Minister, Simon Birmingham, has released a discussion paper in lieu of budget measures.