Using student feedback to make university-directed learning on placement more engaging

Anjali Mehta Chandar: a.m.chandar@reading.ac.uk

Charlie Waller Institute, School of Psychology and Clinical Language Sciences

 

Overview

Our vocational postgraduate courses in Cognitive Behavioural Therapy include University Directed Learning (UDL) days that are completed within the placement setting (e.g. the student's NHS trust). A qualitative student feedback survey allowed us to adapt this format collaboratively, with favourable outcomes in how interesting, enjoyable and useful students found the days.

Objectives

Our objectives were as follows:

- To ascertain how interesting, enjoyable and useful the UDL days were, as perceived by the students, based on pedagogical findings that students engage best and are most satisfied when these characteristics are met (e.g. Ramsden, 2003).

- To make improvements to the UDL days based on qualitative student feedback.

- To ascertain whether our improvements had made the UDL days more interesting, enjoyable and useful, as perceived by the next cohort of students.

Context

The Educational Mental Health Practitioner (EMHP) and Children’s Wellbeing Practitioner (CWP) programmes are one-year vocational postgraduate courses. The students are employed by an NHS trust, local authority or charity, and study at UoR to become qualified mental health practitioners.

UDL days make up a small proportion of the teaching days. They are self-guided teaching days, usually containing elements of e-learning, designed to complement and consolidate face-to-face teaching (live or remote). A combination of learning methods, including e-learning, has been shown to be effective in increasing clinical skills (e.g. Sheikhaboumasoudi et al., 2018).

UDL days had been poorly received by our two 2019/2020 cohorts, according to feedback in the student rep meetings and Mentimeter feedback after each UDL; comments included: ‘there was too much [content] for one day’, ‘I felt pressured to fill [the form] out rather than focussing on the readings themselves’ and ‘[the reflective form] was too long and too detailed’. Whilst this gave us some ideas on changes to make, I was aware of the low completion rates of the Mentimeter feedback. Therefore, to hear from more voices, we decided to create a specific feedback survey about the UDLs to help us make amendments collaboratively.

Implementation

We started by creating a survey for the current students to ascertain their views on how interesting, enjoyable and useful the UDL days were. We also included qualitative questions about what they liked and disliked, and their ideas for specific improvements.

I then led a meeting with my course team to explore the key findings. We agreed to make several changes based on the specific feedback, such as:

– a variety of activities (not purely e-learning, but roleplays, videos, self-practice/self-reflection tasks, group seminars run by lecturers, etc., to provide a more engaging day)
– fewer activities (we aimed for one main activity for every 1-1.5 hours to manage workload demands)
– an option to complete the programme’s reflective form (simplified by asking students to provide their own notes on each task) or to provide their notes in a format of their choice (e.g. mindmaps, vlogs) to increase accessibility
– sharing these reflections on a discussion board for other students and the lecturer to comment on.

We were unable to implement these changes for the current cohort, as they had already completed all the UDL days in their timetable, so we made the changes for the following cohorts in 2020/2021.

We then sought feedback from these new cohorts via a second survey, to ascertain their views on how interesting, enjoyable and useful the UDLs were, with additional questions relating to the new elements.

Impact

After the changes were made to the UDL format, the survey results for the newer cohorts were much more positive than for the original cohort.

There was a significant increase in how interesting, enjoyable and useful the students found the days.

The trainees also largely agreed that the UDLs had an appropriate workload, e.g. one task per 1-1.5 hours.

They also largely agreed that UDLs included interactive and varied tasks. This finding provides a counterpoint to some of the aforementioned literature on the importance of e-learning: it must be remembered that too much e-learning can be less engaging for trainees.

The students also praised the simple reflective form as a helpful tool, and many appreciated the option to submit notes in their own preferred way.

Although we did not explore the role of lecturer feedback in the new UDL survey, research shows that lecturer interaction makes for a more engaging e-learning session (Dixson, 2010), which may partly explain why the UDLs were now received more favourably.

Moreover, the process of collecting data from the students via a feedback form seemed effective: we used the feedback to adapt a specific teaching method and thereby improved student satisfaction. Pedagogical research shows the importance of using qualitative questions (instead of, or as well as, quantitative methods) to elicit student feedback (Steyn et al., 2019).

Reflection

Overall, this redesign was successful, which may be because we used the student voice to make meaningful changes. This is in line with Flodén’s (2017) finding that student feedback can help to improve courses.

Furthermore, the changes we have made are in line with effective practice in other courses and universities, e.g. appropriate workload (Ginns et al., 2007), student choice of discussion format (Lin & Overbaugh, 2007), accessibility of resources (Mahmood et al., 2012) and lecturer interaction (Dixson, 2010).

There is a possible limitation to this case study: our more recent cohorts are generally happier on the course, and may therefore be more positive about the UDLs. In future projects, it would be useful to notice themes within module evaluations and student rep meetings earlier, so that we can elicit specific survey feedback earlier in the course and make amendments sooner, allowing follow-up feedback from the same cohort.

In future variations of the survey, I would also wish to explicitly ask how trainees find sharing reflections on the Blackboard discussion boards, as this is one change we had not elicited feedback on.

Follow Ups

We have continued to use these changes to the UDL format with subsequent cohorts, e.g. reduced workload, a variety of activities, simplified forms, choice of discussion format and lecturer interaction. Since the original cohort, we have not received concerns about these days in the student rep meetings. The Mentimeter feedback at the end of each UDL is generally positive, with one person recently commenting: ‘this was a very engaging day’.

References


Dixson, M. D. (2010). Creating effective student engagement in online courses: What do students find engaging? Journal of the Scholarship of Teaching and Learning, 1-13.

Flodén, J. (2017). The impact of student feedback on teaching in higher education. Assessment & Evaluation in Higher Education, 42(7), 1054-1068.

Ginns, P., Prosser, M., & Barrie, S. (2007). Students’ perceptions of teaching quality in higher education: The perspective of currently enrolled students. Studies in Higher Education, 32(5), 603-615.

Lin, S. Y., & Overbaugh, R. C. (2007). The effect of student choice of online discussion format on tiered achievement and student satisfaction. Journal of Research on Technology in Education, 39(4), 399-415.

Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A comparative study of student satisfaction level in distance learning and live classroom at higher education level. Turkish Online Journal of Distance Education, 13(1), 128-136.

Ramsden, P. (2003). Learning to teach in higher education. Routledge.

Sheikhaboumasoudi, R., Bagheri, M., Hosseini, S. A., Ashouri, E., & Elahi, N. (2018). Improving nursing students’ learning outcomes in fundamentals of nursing course through combination of traditional and e-learning methods. Iranian Journal of Nursing and Midwifery Research, 23(3), 217.

Steyn, C., Davies, C., & Sambo, A. (2019). Eliciting student feedback for course development: the application of a qualitative course evaluation tool among business research students. Assessment & Evaluation in Higher Education, 44(1), 11-24.

Links

CWI website: https://sites.reading.ac.uk/charlie-waller-institute/

The One Where a Timetable Merger Gives Rise to a Curriculum Implementation Review

Emma-Jayne Conway, James Kachellek and Tamara Wiehe

t.wiehe@reading.ac.uk


Overview

Staff and students in CWI collaborated on a project initially designed to merge the timetables of two sister programmes to aid cross-programme working (objective 1), which in turn gave rise to the perfect opportunity to review the way our PWP curriculum is implemented following the pandemic (objective 2). In this blog, we reflect on achieving both objectives within our original timeframe!

Objectives

1. To create a single timetable to aid cross-programme working for teaching and administrative staff.

2. To review curriculum implementation, including structure and modality, on a modular and programme level with all key stakeholders.

Context

In response to a departmental restructure, we required more efficient ways of working across programmes, starting with a uniform timetable. Early on, the project evolved to also review the structure and modality of the curriculum. Our two sister PWP training programmes (one undergraduate and one postgraduate) are virtually identical, with a few exceptions, but historically had been managed separately.

Over the course of 2021, we planned, designed, and implemented a timetable merger for our September cohorts. This impacted on 3 modules (4 for undergraduates) that form the PWP training year for the MSci Applied Psychology (Clinical) students and the Postgraduate/graduate Certificate in Evidence-Based Psychological Treatments (IAPT Pathway).

Taking both Higher Education and mental health care processes into consideration was no easy feat, including those specific to the University of Reading (e.g. term dates), the national PWP curriculum specifying the content and learning outcomes for our 26 teaching days and 19 study days, and British Psychological Society (BPS) accreditation requirements. Modality was a particularly important topic throughout this project, taking key learnings from remote delivery during the pandemic as well as awaiting guidance from our professional accrediting body.

Overall, it served as an excellent opportunity to work collaboratively with staff and students to review the implementation of PWP training at the University of Reading.

Implementation

  1. Early 2021: The PWP team met on several occasions to discuss the possibility of merging the two timetables, including transitioning to a “blended” format of online and face-to-face teaching post-Covid. We set out a timeline for planning, designing, and implementing the project.
  2. Advice was sought from the Director of Training in CWI and colleagues in Academic Development (CQSD), based on their experience of timetable mergers, and a green light was given to our draft plans!
  3. Several options were considered before the final format was arrived at: face-to-face teaching is weighted towards the first module/term, with a progressive increase in online taught content as the course progresses (rationale supplied elsewhere in this blog).
  4. The educator team were able to draw on feedback from online teaching to gauge the attitude of the student body to online learning, as well as expectations and concerns related to a return to the classroom (see Impact, below). The student voice was important in terms of utilising partnership to create meaningful change to curriculum implementation. However, engaging professional practice students via the course reps was a challenge due to time constraints; we were therefore able to engage graduates instead. This is something we would consider earlier on in future projects.
  5. The educator team unanimously agreed that the externally taught content of the VEC module could be effectively taught with mixed cohorts from the Core-PWP and MSci cohorts using an online approach.
  6. Information on the changes was disseminated to Programme Administrators to enable efficient implementation. External Educators were made aware of the retention of online lecture sessions, and of the mixed-cohort approach, by the VEC module convenor.
  7. Timetables were updated by the Programme Director, in collaboration with Module Convenors; consideration was given to the potential workload impact of closely aligning multiple cohorts (see below). Timetables have been looked at by the team ‘side by side’ to ensure that workload balance is maintained for educators across all cohorts. We can continue to monitor the impact on workload while adjustments are made to teaching (such as with the working document mentioned in the Follow-up section, below).
  8. IAPT services were made aware of the changes to the timetables.

Impact

As of October 2021, the merged timetables are proving effective, with no major barriers having been detected. Predicted barriers had included difficulties in teaching (previously face-to-face) content effectively, student/staff dissatisfaction with a blended approach, and significant administrative/technical difficulties.

Face-to-face teaching resumed in September 2021, and the return to the classroom has been successful. Educators report being able to switch between live sessions and face-to-face teaching days without significant barriers.

The educator team plan to continue to gather feedback on the student experience of the blended and merged approach. We will be able to assess feedback when the first cohorts fully complete in June 2022.

Feedback will be sought from module convenors, educators, and programme administrators using Mentimeter (“menti”) feedback forms, bi-weekly team meetings and informal qualitative discussion, to gauge the impact of the changes on workload. Student feedback will also be monitored through end-of-module feedback collated by the School.

Reflection

  • Engaging professional practice students was a challenge, and utilising graduates helped to overcome this. We will consider setting up a graduate/patient advisory group for future projects.
  • Using feedback from an MSci graduate led to timetable changes to ensure readability and clarity for students. This included points such as colour-coding face-to-face versus online teaching days, explaining acronyms, etc.
  • Involving all members of the team (especially Module Convenors) felt like a much more meaningful and collaborative process than Programme Director decisions alone. It gave Module Convenors autonomy over their modules, as well as aligning learning outcomes across the three modules of the programme, which is particularly important for clinical training. Other courses may wish to replicate this approach to build team cohesion and allow all colleagues to make meaningful contributions to programme changes and delivery.

Follow up

  • A working document has been created for the educator team to comment on the teaching they have just delivered (e.g. was there enough time to deliver all content?). This has allowed changes to be made within a matter of weeks, as the same day is delivered across the programmes. As a result, we can fine-tune the timetable and delivery of the programme quickly and efficiently to improve the student experience.
  • We will review module by module and at the end of each cohort to continue making any necessary adjustments. Module and programme evaluations, student-staff rep meetings and any feedback from individual teaching days will also help to inform this.

 

Driving programme development in the IOE: student focus groups and paper writing

Jo Anna Reed Johnson – Institute of Education

j.a.reedjohnson@reading.ac.uk


Overview

This article outlines how student focus groups were used to drive programme development across three IOE programmes. The goal of writing a paper and presenting at a conference helped me to frame this project with a team of academics, focusing on changes made during Covid-19 (2020-2021). The article shares reflections on setting up and running the focus groups, delivering the conference presentation and writing the final paper. Finally, it discusses what we have learnt and what we will continue to do.

Objectives

  • Share four academic perspectives on the redesign, due to Covid-19, of three modules (SKE, STEAM, PGCE Sec Science) that all have practical elements (laboratory or school), by sharing what we did and exploring the student perspectives
  • Show how we designed opportunities for discussion and collaboration when conducting practical work or school-related work online
  • Consider the use of student focus groups for programme co-development
  • Reflect on the collaborative nature of paper writing and co-programme reflections

Context

At the IOE there are a range of teacher education programmes with a practical focus. The four colleagues engaged in this article were involved with Skills in Schools (ED2TS1 – March to July 2020), SKE (IESKEP and PFTZSKEMATHSA – March to Aug 2020) and PGCE Secondary Science (GFSTSCIENCE – September 2020 to June 2021). These programmes all require students to work in schools and engage in a science laboratory (if science focused). When Covid-19 hit in March 2020, we had to think quickly and imaginatively, transforming our provision to be online where required. Having worked across all three programmes, I felt it was pedagogically appropriate to engage our students in the same way we had throughout their learning during the pandemic: working in online communities of practice to reflect. We therefore decided to set up a series of focus groups in which students reflected on the impact of the changes and provided insights for future programme innovations. This culminated in a conference presentation and paper.

Implementation

The focus was to drive programme development through the reflections and shared experiences of academics and students. I set up a project timeline and an MS Team to manage and drive the deliverables, with the end goal of engaging students as co-programme developers and culminating in a conference presentation and paper. This required framing the project, seeking ethical approval and funding, setting up focus groups to collect data, and then reflecting and writing up.

Framing the project allowed me to maintain the focus on the redesign, due to Covid-19, of three modules that all had practical elements (laboratory or school), and then on exploring how that had impacted students, through focus groups. It was the conference and paper deadlines that drove this activity and timeline. At first, colleagues wondered why we were writing a paper for a call associated with a School of Architecture (Manchester and Manchester Metropolitan University); in fact, it was because the call was about ‘place’. The remit was a paper related to ‘online education: teaching in a time of change’.

Seeking ethical approval and funding required knowing where to go and what to do. Ethical approval required submission of an ethical approval form (including consent form, interview schedule and focus group details) to the IOE ethics committee. Funding was then applied for through the University Travel Grants Scheme (contact: Tasha Easton – e.saxon@reading.ac.uk).

Data collection was initially carried out using MS Forms for the initial feedback request. Consent was also required, so where this could not be obtained in person, consent approval had to be attached to the top of the MS Form. Once participants had consented, and those who were willing had indicated they would take part, I could set up a series of focus groups across the three programmes, to take place on MS Teams. We decided to split the four sets of interviews into subject-specific groups so that the conversations and reflections could be driven by the students. One student was nominated as the chair, and they had a template of questions to guide their discussions.

Paper writing was a challenge, as we needed to fit this around our Teaching Focused roles. I created a writing template after attending an IOE Research and Scholarship Writing Workshop with Professor Alan Floyd, and scheduled meetings to review, discuss and allocate sections of writing.

The whole process began in December 2020 and continued through to 30 May 2021, with the conference on 21-23 April 2021 and paper publication in July 2021.

 

Impact

There were several elements of impact:

  • Working collaboratively with colleagues to reflect on programme development
  • Engaging students as co-programme developers
  • Attending a conference (where funding allowed)
  • Conference paper presentation
  • Conference paper publication

Reflection

In terms of setting up focus groups and driving data collection, we learnt that we needed to be organised, and the timeline/plan really helped to keep that focus. There were times when we were too busy, but we had to create time as we had deliverables to meet. If we had not had the deliverables of a conference presentation and paper, we might have let this slip and done it ‘next year’.

Writing the paper was a challenge in that we had not done this together before, and some colleagues had not written an academic paper, let alone an educational one, in a very long time. So creating that writing template and allocating tasks worked.

Gaining conference funding can always be a challenge, but reaching out and asking was the first thing to do, then finding out what could be offered at University/School level. Next time, we would all like to attend the conference. Being an online conference made it more difficult to engage, and I think next time we would plan for us all to get funding and attend a face-to-face conference so that we too can benefit from being part of the community of practice.

What we will continue to do….

  • Develop students as programme co-developers through focus groups.
  • Use focus groups to help us (academics) reflect on our own practice and discuss developments across programmes.
  • Drive programme development through the sharing of practices, building communities of practice with timelines and deliverables.

What else will we do…

  • Engage students in the paper writing and conference process.
  • Seek funding to attend a face-to-face conference with colleagues, to allow us time and space to continue to reflect on practice.

Links

Research and Travel Grants Committee: https://www.reading.ac.uk/closed/research/committees/res-researchtravelgrantsubcommittee.aspx

AMPS Conference 21-23 April 2021 – https://architecturemps.com/online-ed-conference/

Online Delivery of Practical Learning Elements for SKE

Jo Anna Reed Johnson, Gaynor Bradley, Chris Turner

j.a.reedjohnson@reading.ac.uk

Overview

This article outlines the re-thinking of how to deliver the science practical elements of the Subject Knowledge Enhancement programme (SKE) following the impact of Covid-19 in March 2020 and the move online. It focuses on what we learnt from delivering the practical elements of the programme online: students were required to engage in laboratory activities, related to the school national curriculum, to develop their subject knowledge skills in Physics, Chemistry and Biology over two weeks in June and July 2020. Whilst there are some elements of the programme we will continue to deliver online post Covid-19, there are aspects of practical work for which our students would still benefit from hands-on experience in the laboratory, with online resources enhancing that experience.

Objectives

  • Redesign IESKEP-19-0SUP: Subject Knowledge Enhancement Programmes so it was COVID-safe and fulfilled the programme objectives in terms of practical work
  • Redesign how students could access school science practical work with no access to laboratories (this relates to required practical work for GCSE and A Level)
  • Ensure opportunities for discussion and collaboration when conducting practical work
  • Provide students with access to resources (posted pack and shopping list)
  • Review student perspectives related to the response to online provision

Context

In June 2020 there was no access to the science laboratories at the Institute of Education (L10) due to the Covid-19 pandemic. The Subject Knowledge Enhancement programme (SKE) is a pre-PGCE requirement for applicants who want to train to be a teacher but may be at risk of not reaching the right level of subject knowledge (Teacher Standard 3) by the end of their PGCE year (a one-year postgraduate teacher training programme). We had 21 SKE Science (Physics, Chemistry, Biology) students on the programme, three academic staff, one senior technician and two practical weeks to run. We had to think quickly and imaginatively. With a plethora of school resources available online for practical science, we set about reviewing these and deciding what we might use to provide our students with the experience they needed. In addition, we streamlined the programme content for the practical weeks, as working online requires more time.

Implementation

In May 2020, the senior science technician was allowed access to the labs. With a lot of work, she prepared a small resource pack with some basic equipment that was posted to each student. This was supplemented with a shopping list that students would prepare in advance of the practical weeks. For the practical week programme, we focused on making use of free videos available on YouTube and the web (https://www.youtube.com/watch?v=jBVxo5T-ZQM and https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12).

Having been part of an online lab research project at the University of Leicester, I introduced this to students for simulations, along with PHET (https://www.golabz.eu/ and https://phet.colorado.edu/).


We also wanted students to still feel some sense of ‘doing practical work’, and set up home labs for those topics we deemed suitable, e.g. heart dissection, quadrats and making indicators.


PowerPoints were narrated or streamed. We set up regular meetings in which colleagues met each morning before the practical day started. We gave live introductions to the students to outline the work to be covered that day. In addition, we organised drop-in sessions, such as mid-morning breaks and end-of-day reviews, for discussion with the students. Throughout, students worked in groups where they would meet, discuss questions and share insights. This was done through MS Teams meetings/channels, where we were able to build communities of practice.

Impact

End-of-programme feedback was received via our usual emailed evaluation sheet at the end of the practical weeks and the end of the programme (5/21 responses). The overall feedback was that students had enjoyed the collaboration and thought the programme was well structured.

‘I particularly enjoyed the collaborative engagement with both my colleagues and our tutors. Given that these were unusual circumstances, it was important to maintain a strong team spirit as I felt that this gave us all mechanisms to cope with those times where things were daunting, confusing etc but also it gave us all moments to share successes and achievements, all of which helped progression through the course. I felt that we had the right blend of help and support from our tutors, with good input balancing space for us to collaborate effectively.’

Student Feedback initial evaluation

‘I enjoyed “meeting” my SKE buddies and getting to know my new colleagues. I enjoyed A Level Practical Week and found some of the online tools for experimentation and demonstrating experiments very helpful’

Student Feedback initial evaluation

To supplement this feedback, as part of a small-scale scholarly activity across three programmes, we also sent out an MS Form for students to complete, to allow us to gain deeper insights into the transition to online learning (22/100 responses). The responses highlighted, for example, the students’ excitement about doing practical work, and their experience of using online tools that they could then use in their own teaching practice:

‘…the excitement of receiving the pack of goodies through the post was real and I enjoyed that element of the program and it’s been genuinely useful. I’ve used some of those experiments that we did in the classroom and as PERSON B said virtually as well.’

Student Feedback MS Form

‘…some of the online simulators that we used in our SKE we’ve used. I certainly have used while we’ve been doing online learning last half term, like the PHET simulators and things like that…’

Student Feedback MS Form

The students who consented to take part engaged in four discussion groups (five participants per group, 20 participants in total). These took place on MS Teams, and once again highlighted the benefits of online group engagement, as well as showing that we were still able to meet the programme objectives:

‘I just wanted to say, really. It was it was a credit to the team that delivered the SKE that it got my subject knowledge to a level where it needed to be, so I know that the people had to react quickly to deliver the program in a different way…’

Student Feedback Focus Group

There was some feedback that helped us to review and feed back into our programme development, such as surprise at how much independent learning there is on the programme, and at the amount of resources and other materials (e.g. exam questions).

Reflections

We adopted an integrated learning model.

We learnt that you do not have to reinvent the wheel. With the plethora of tools online, we just needed to be selective, asking ‘did the tool achieve the purpose or learning outcome?’. In terms of planning and running online activities, we engaged with Gilly Salmon’s (2004) five-stage model of e-learning. This provides structure, and we would apply it to our general planning in the use of Blackboard or other TEL tools in the future. We will continue to use the tools we used: these are useful T&L experiences for our trainees as schools move more and more towards engagement with technology.

However, the students still felt that nothing can replace actual practical work in the labs:

‘I liked the online approach to SKE but feel that lab work face-to-face should still be part of the course if possible. There are two reasons for this: skills acquisition/familiarity with experiments and also networking and discussion with other SKE students.’

Student Feedback MS Form

Where possible we will do practical work in the labs, supplemented with online resources, videos and simulation applications. We will make sure that the course structure prioritises the practical work but also incorporates aspects of online learning.

We will continue to provide collaborative opportunities and engage students online for group work and tutorials in future years. We also found that collaborating through communities of practice, on MS Teams channels, was very effective. We set up groups, which continued to work in similar ways throughout the course and were able to share ideas by posting evidence and then engaging in discussion. Again, this is something we will continue to do, so that when our students are dispersed across the school partnership, in different locations, they can still keep in touch and work on things collaboratively.

Links and References

Online Lab Case Studies

https://www.golabz.eu/

https://phet.colorado.edu/

Practical Week Free Videos

https://www.youtube.com/watch?v=jBVxo5T-ZQM

https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a re-design undertaken for the Part 1 autumn/spring chemistry module, CH1PRA, which services approximately 45 students per year. All students complete practical work over 20 weeks of the year: there are four blocks of five weeks of practical work in rotation (introductory, inorganic, organic and physical), and students spend one afternoon (4 hours) in the laboratory per week. The re-design was partly due to COVID, as we were forced to look critically at the experiments the students completed, to ensure that the practical skills students developed during the COVID pandemic were relevant for Part 2 and beyond, and to ensure that the assessments could also stand alone as exercises if COVID prevented the completion of practical work. COVID actually provided us with an opportunity to re-invigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

• Redesign CH1PRA so it was COVID-safe and fulfilled strict accreditation criteria.
• Redesign the experiments so that as many students as possible could complete practical work, converting some experiments for completion on the open bench to maximise laboratory capacity.
• Redesign assessments so that students who missed sessions due to COVID could still collect credit.
• Minimise the assessment load on academic staff and students.
• Move to a more skills-based assessment paradigm, away from the traditional laboratory report.

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of a practical class due to restrictions on the number of students allowed within the laboratory: 12 students in the fumehoods and 12 students on the open bench (rather than up to 74 students all using fumehoods previously). Prior to the redesign, each student completed four or five assessments per 5-week block, and all of the assessments related to a laboratory-based experiment. In addition, the majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as we intended; execution of the experiment was therefore passive. The technical reports placed a significant marking burden upon the academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments upon completion of a practical experiment was also deemed high-risk with the advent of COVID; we therefore had to re-think our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing 24 students to be processed per week (12 on the bench and 12 in the fumehoods), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class, therefore fulfilling our accreditation requirements. The majority of assessments were linked to the ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that if a student had to self-isolate they could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each 5-week block had three assessment points, and where possible one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require upon employment or during a placement. To encourage active learning, reflection was embedded into the assessment regime; it was hoped that by critically appraising their performance in the laboratory, students would better retain the skills and techniques they had learnt, rather than falling into the “see, do, forget” mentality that is prevalent within practical classes.

Examples of assessments include: undertaking data analysis, focussing on clear presentation of data; critical self-reflection on the skills developed during a practical class, i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critically engaging with a published scientific procedure; and giving a three-minute presentation about a practical scientific technique commonly encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well-run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments that we had to make to the delivery of laboratory based practicals did not have a negative impact upon the students’ experience and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous, and there was a slight disconnect between what we expected students to achieve from the online practicals and what students were achieving. A number of the students polled disliked the online practical work, with the main reason being that the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and by ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they align more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.


Considering wellbeing within the placement module assessment

Allán Laville (Dean for D&I and Lecturer in Clinical Psychology) and Libby Adams (Research Assistant), SPCLS

Overview

This project aimed to design a new alternative assessment to form part of the MSci Applied Psychology course, which puts emphasis on the practical side of training as a Psychological Wellbeing Practitioner (PWP). This included utilising problem-solving skills and wellbeing strategies.

Objectives

  • This project was funded by the SPCLS Teaching & Learning Enhancement Fund and aimed to design an alternative assessment to be used as part of the MSci Applied Psychology course to support student wellbeing.
  • The project aimed to incorporate an assignment into the curriculum which provides students with transferable problem-solving and wellbeing management strategies that can be used in future mental health support/clinical roles.

Context

The project was undertaken because, within IAPT, Psychological Wellbeing Practitioners (PWPs) are required to work in a fast-paced environment, seeing multiple patients back-to-back throughout the day. Students on the MSci Applied Psychology course are required in their third year to undertake a work placement one day a week in the first term, increasing to two days a week in the second term. Students are also required to undertake one full day of training per week. The aim of the project was to embed an assignment within the curriculum which focusses on managing wellbeing.

Implementation

Allán Laville (Dean for Diversity and Inclusion) brought forward the concept of incorporating wellbeing within the curriculum and contacted Libby Adams (Part 4 MSci student) to see whether she would take part in the development of the new assessment. Libby Adams was included as she had previously trained as a Psychological Wellbeing Practitioner and had first-hand experience of the challenges of managing the demands of the PWP role as a trainee, and in turn of managing her own wellbeing.

Libby Adams’ experience

The project was developed with my own challenges in mind; to build upon this, we then met with current and past MSci students to gain insight into the challenges they faced. We were then able to condense this information and incorporate it within our concept of a wellbeing blog. We then considered how we could problem-solve ways around the areas that could not be included in the blog. At the second stage, we met with clinical staff and educators to share our idea and gain feedback on the feasibility of implementation within IAPT services. The final project design was then formed with the above feedback in mind.

Impact

Views from current MSci students on the benefits of the project:

“I think maintaining our own wellbeing is such a critical part of caring professions, and I think that making it a clear and mandatory part of the course you’re not only helping students look after themselves for this year, but also for their future careers as well.”

Relating to the outlined objectives, the project successfully designed a prototype assessment which considers the importance of maintaining wellbeing and utilising problem-solving skills. The project will have a positive impact on individuals not only in their placement year but also if they choose to go into a clinical career after university, as the skills are transferable.

Traffic Light Mood Tracker

Students are required to complete the traffic light system to indicate how they are currently managing their wellbeing. They complete it three times for each blog: once before the reflection, once after they have built an action plan based on their reflection, and again in the last term of the academic year, reflecting on their progress.

Reflections

Allán Laville’s reflections:

The project addressed a key consideration within both University training as well as within the psychological workforce, namely, the importance of explicitly considering the wellbeing of our practitioners and therapists. I am delighted with the outcome of the project and it would not have been possible without Libby. Her commitment to psychological therapies and intrinsic motivation to support others, always shines through!

Libby Adams’ reflections:

The student-staff partnership is key to improving the overall teaching and learning experience. The partnership allows the member of staff to lead as the expert by knowledge and the student to lead as the expert by experience. Such partnerships allow the development of concepts and improvements in teaching and learning which enhance the student and staff experience.

Follow up

In the future we aim to share our findings with other MSci courses and IAPT services, with the aim of increasing conversations about practitioner wellbeing and highlighting its importance within clinical roles. We hope that the strategies used in this project can extend beyond students and be used across IAPT services to maintain wellbeing, improve performance and decrease stress and burnout.

Clinical skills development: using controlled condition assessment to develop behavioural competence aligned to Miller’s pyramid

Kat Hall, School of Chemistry, Food and Pharmacy, k.a.hall@reading.ac.uk

Overview

The Centre for Inter-Professional Postgraduate Education and Training (CIPPET) provides PGT training for healthcare professionals through a flexible Masters programme built around blended learning modules alongside workplace-based learning and assessment. This project aimed to evolve the department’s approach to delivering one of our clinical skills workshops, which sits within a larger 60-credit module. The impact was shown via positive student and staff feedback, as well as interest in developing a standalone module for continuing further learning in advanced clinical skills.

Objectives

The aim of this project was to use controlled condition assessment approaches to develop behavioural competence at the higher levels of Miller’s pyramid of clinical competence (Miller, 1990).

Miller’s Pyramid of Clinical Competence

The objectives included:

  1. engage students in enquiry by promoting competence at higher levels of Miller’s pyramid
  2. develop highly employable graduates by identifying appropriate skills to teach
  3. evolve the workshop design by using innovative methods
  4. recruit expert clinical practitioners to support academic staff

Context

Health Education England are promoting a national strategy to increase the clinical skills training provided to pharmacists; this project therefore aimed to evolve the department’s approach to delivering this workshop. The existing module design contained a workshop on clinical skills, but it was loosely designed as a large group exercise which was delivered slightly differently for each cohort. This prevented students from fully embedding their learning through opportunities to practise skills alongside controlled formative assessment.

Implementation

Equipment purchase: As part of this project, matched funding was received from the School to support the purchase of simulation equipment, which meant a range of clinical skills teaching tools could be utilised in the workshops. This step was undertaken collaboratively with the physician associate programme to share learning and support meeting objective 2 across the School.

Workshop design: the workshops were redesigned by the module convenor, Sue Slade, to centre on specific aspects of clinical skills that small groups could work on with a facilitator. The facilitators were supported to embed the clinical skills equipment within the activities, thereby engaging students in active learning. The equipment gave students the opportunity to simulate the skills test and identify whether they could demonstrate competence at the ‘Knows How’ and ‘Shows How’ levels of Miller’s pyramid of clinical competence. Where possible, the workshop stations were facilitated by practising clinical practitioners. This step was focused on meeting objectives 1, 2, 3 and 4.

Workbook design: a workbook was produced that students could use to identify the core clinical skills they required in their scope of practice and thus needed to practise in the workshop and further in their workplace-based learning. This scaffolding supported their transition to the ‘Does’ level of Miller’s pyramid of clinical competence. This step was focused on meeting objectives 1 and 3.

Impact

All four objectives were met and have since been mapped to the principles of the Curriculum Framework to provide evidence of their impact.

Mastery of the discipline / discipline based / contextual: this project has supported the academic team to redesign the workshop around the evolving baseline core knowledge and skills required of students.  Doing this collaboratively between programme teams ensures it is fit for purpose.

Personal effectiveness and self-awareness / diverse and inclusive: the positive staff and student feedback received reflects that the workshop provides a better environment for student learning, enabling them to reflect on their experiences and take their learning back to their workplace more easily.

Learning cycle: the student feedback has shown that they want more of this type of training and so the team have designed a new stand-alone module to facilitate extending the impact of increasingly advanced clinical skills training to a wider student cohort.

Reflections

What went well? The purchase of the equipment and the redesign of the workshop were relatively simple tasks for an engaged team, and low effort for the potential return in improved experience. Having one lead for the workshop, while another colleague wrote the workbook and purchased the equipment, ensured that staff across the team could contribute as change champions. Recruitment of an advanced nurse practitioner to support the team more broadly was completed quickly and provided support and guidance across the year.

What did not go as well?  Whilst the purchase of the equipment and workshop redesign was relatively simple, encouraging clinical practitioners to engage with the workshop proved much harder.  We were unable to recruit consistent clinical support which made it harder to fully embed the project aims in a routine approach to teaching the workshop.  We considered using the expertise of the physician associate programme team but, as anticipated, timetabling made it impossible to coordinate the staffing needs.

Reflections: The success of the project lay in having the School engaged in supporting the objectives and the programme team invested in improving the workshop.  Focusing this project on a small part of the module meant it remained achievable to complete one cycle of change to deliver initial positive outcomes whilst planning for the following cycles of change needed to fully embed the objectives into routine practice.

Follow up

In planning the next series of workshops, we plan to draw more widely on University alumni from the physician associate programme, to continue the collaborative approach and attract clinical practitioners who are more willing to support us and less constrained by timetables and clinical activities.

Based on student and staff feedback, there is clearly a desire for more teaching and learning of this kind, and being able to launch a new standalone module in 2020 is a successful output of this project.

Links and References

Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63-S67.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic; the first entry outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students associate with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal) and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but this time we gave students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A Survey Monkey survey was circulated to all DEL students with the same series of questions as those used for the focus group, in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The Survey Monkey results are summarised below. The numbers refer to student responses in each category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% in the focus group preferred to keep to the traditional exam + assessed essay method, the survey found the aversion to exams to be more prominent. 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We find it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second highest scorer (presumably a connection between the two was identified by students).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue where student choice and pedagogy are not entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, modules retaining an exam would be left in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but this could leave modules that retain traditional assessment patterns vulnerable to students deselecting them. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL. We will outline the results of our work and invite colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models where they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the link below, which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: assessment conventions in DEL, particularly at Part 2, follow a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments best suited to attaining them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 1)

Dr Madeleine Davies, School of Literature and Languages

Overview

The Department of English Literature (DEL) is organising student focus groups as part of our TLDF-funded ‘Diversifying Assessments’ project led by Dr Chloe Houston and Dr Madeleine Davies. This initiative is in dialogue with Curriculum Framework emphases on engaging students in Programme Development and involving them as stakeholders. This entry outlines the preparatory steps taken to set up our focus groups, the feedback from the first meeting, and our initial responses to it.

Objectives

  • To involve students in developing a more varied suite of assessment methods in DEL.
  • To hear student views on existing assessment patterns and methods.
  • To gather student responses to electronic methods of assessment (including learning journals, blogs, vlogs and wikis).

Context

We wanted to use Curriculum Framework emphases on Programme Review and Development to address assessment practices in DEL. We had pre-identified areas where our current systems might usefully be reviewed and we decided to use student focus groups to provide valuable qualitative data about our practices so that we could make sure that any changes were informed by student consultation.

Implementation

I attended a People Development session ‘Conducting Focus Groups’ to gather targeted knowledge about setting up focus groups and about analytical models of feedback evaluation. I also attended a CQSD event, ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, to gain new ideas about feedback practice.

I applied for and won TLDF mini-project funding to support the Diversifying Assessments project. The TLDF funding enabled us to treat the student focus groups as a year-long consultative process, supporting a review of assessment models and feedback practices in DEL.

In Spring Term 2017, I emailed our undergraduate students and attracted 11 students for the first focus group meeting. We aim to include as diverse a range of participants as possible in the three planned focus group meetings in 2016-17. We also aim to draw contributors from all parts of the undergraduate programme.

To prepare the first focus group:

  • I led a DEL staff development session on the Diversifying Assessment project at the School of Literature and Languages’ assessment and feedback away day; this helped me to identify key questions and topics with colleagues.
  • I conducted a quantitative audit of our assessment patterns and presented this material at the staff session to illustrate the nature of the issues we aim to address. This tabulated overview enabled colleagues to see that the need for an assessment and feedback review was undeniable.

At the first focus group meeting, topics and questions were introduced by the two project leaders and our graduate intern, Michael Lyons, took minutes. We were careful not to approach the group with clear answers already in mind: we used visual aids to open conversation (see figures 1 and 2) and to provide the broad base of key debates. We also used open-ended questions to encourage detail and elaboration.

Group discussion revealed a range of issues and opinions that we would not have been able to anticipate had we not held the focus group:

  • Students said that a module’s assessment pattern was the key determinant in their selection of modules.
  • Some students reported that they seek to avoid exams where possible at Part Two.
  • Discussing why they avoid exams, students said that the material they learn for exams does not ‘stick’ in the same way as material prepared for assessed essays and learning journals, so they feel that exams are less helpful in terms of learning. Some stated that they do not believe that exams offer a fair assessment of their work.
  • Students wholly supported the use of learning journals because they spread the workload and because they facilitate learning. One issue the students emphasised, however, was that material supporting learning journals had to be thorough and clear.
  • Presentations were not rated as highly as a learning or assessment tool, though a connection with employability was recognised.
  • Assessed essays were a popular method of assessment: students said they were proud of the work they produced for summative essays and that only ‘bunched deadlines’ caused them problems (see below). This response was particularly marked at Part Two.
  • Following further discussion, it emerged that our students had fewer complaints about the assessment models we used, or about the amount of assessment in the programme, than they did about the assessment feedback. This is represented below:

To open conversation, students placed a note on a scale in response to two questions. Question 1: ‘Do we assess too much, about right, not enough?’ (‘About right’ was the clear winner.)

Question 2: ‘Do we give you too much feedback, about right, or too little?’ (Responses favoured the range between ‘about right’ and ‘too little’.)


The results of this exercise, together with our subsequent conversation, helped us to understand the importance of feedback to the Diversifying Assessment project; however, subsequent to the focus group meeting, the DEL Exams Board received an excellent report from our External Examiners who stated that our feedback practices are ‘exemplary’. We will disseminate this information to our students who, with no experience of feedback practices other than at the University of Reading, may not realise that DEL’s feedback is regarded as an example of best practice by colleagues from other institutions. We are also considering issuing our students with updates when assessed marking is underway so that they know when to expect their marks, and to demonstrate to them that we are always meeting the 15-day turnaround. The external examiners’ feedback will not, however, prevent us from continuing to reflect on our feedback processes in an effort to enhance them further.

Following the focus group meeting, we decided to test the feedback we had gathered by sending a whole-cohort online survey: for this survey, we changed the ‘feedback’ question slightly to encourage a more detailed and nuanced response. The results, which confirmed the focus group findings, are represented below (with thanks to Michael Lyons for producing these graphics for the project):

A total of 95 DEL students took part in the survey. 87% said they valued the opportunity to be assessed with diverse methods.

Assessed essays were the most popular method of assessment, followed by the learning journal. However, only a small proportion of students have been assessed by learning journal, which suggests that a high percentage of those who have experienced this format named it as their preferred method of assessment.

On a scale from 0-10 (with 0 being too little, 5 about right, and 10 too much), the students gave an average score of 5.1 for the level of assessment on their programmes, with 5 being both the mode and the median score.

34% found the level of detail the most useful aspect of feedback, 23% the comments on writing style, 16% the clarity of the feedback, and 13% its promptness. 7% cited other qualities (e.g. ‘sensitivity’) and 7% did not respond to this question.

66% said they always submit formative essays, 18% do so regularly, 8% half of the time, 4% sometimes, and 4% never do.

40% said they always attend essay supervisions (tutorials) for their formative essays, 14% do so regularly, 10% half of the time, 22% sometimes, and 14% never do.

Impact

The focus group conversation suggested that the area on which we need to focus in DEL, in terms of diversification of assessment models, is Part Two assessment provision because Part One and Part Three already have more diversified assessments. However, students articulated important concerns about the ‘bunching’ of deadlines across the programme; it may be that we need to consider the timing of essay deadlines as much as we need to consider the assessment models themselves. This is a conversation that will be carried forward into the new academic year.

Impact 1: Working with the programme requirement (two different types of assessment per module), we plan to move more modules away from the 2,000-word assessed essay and exam model that 80% of our Part Two modules have been using. We are now working towards an assessment landscape in which, in the 2017-18 academic session, only 50% of Part Two modules will use this assessment pattern. The others will use a variety of assessment models, potentially including learning journals and assessed essays; assessed presentations and assessed essays; vlogs and exams; wikis, presentations and assessed essays; and blogs and 5,000-word module reports.

Impact 2: We will address the ‘bunched’ deadlines problem by producing an assessments spreadsheet that plots each assessment point on every module, allowing us to retain an overview of students’ workflow and to spread deadlines more evenly (a minimal sketch of the underlying check follows).
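As a purely illustrative sketch (the spreadsheet itself will be a manually maintained planning document; the module codes and dates below are invented for illustration), a few lines of Python show the kind of check the overview makes possible, flagging weeks in which assessment deadlines cluster:

    from collections import Counter
    from datetime import date

    # Hypothetical deadlines (module code -> assessment dates); real data
    # would come from the departmental assessments spreadsheet.
    deadlines = {
        "EN2AA": [date(2017, 11, 10), date(2018, 1, 12)],
        "EN2BB": [date(2017, 11, 10)],
        "EN2CC": [date(2017, 11, 17), date(2018, 1, 12)],
    }

    # Count assessment points per ISO (year, week) to spot 'bunching'.
    per_week = Counter(
        d.isocalendar()[:2] for dates in deadlines.values() for d in dates
    )

    for (year, week), n in sorted(per_week.items()):
        flag = " <-- bunched" if n > 1 else ""
        print(f"{year}, week {week}: {n} deadline(s){flag}")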

Impact 3: The next phase of the project will focus on the type, quality and delivery of feedback. Prior to the focus group, we had not realised how important this issue is, though the External Examiners’ 2017 report for DEL suggests that communication may be the more significant factor in this regard. Nevertheless, we will disseminate the results of the online survey to colleagues and encourage more detail, and more advice on writing style, in feedback.

Anticipated Impact 4: We are expecting enhanced attainment as a result of these changes because the new assessment methods, and the more even spread of assessment points, will allow students to present work that more accurately reflects their ability. Further, enhanced feedback will provide students with the learning tools to improve the quality of their work.

Reflections

Initially, I had some reservations about whether student focus groups could give us the reliable data we needed to underpin assessment changes in DEL. However, the combination of quantitative data (via the statistical audit I undertook and the online survey) and qualitative data (gathered via the focus groups and again by the online survey) has produced a dependable foundation. In addition, ensuring the inclusion of a diverse range of students in a focus group, drawn from all levels of the degree and from as many communities as possible within the cohort, is essential for the credibility of the subsequent analysis of responses. Thorough reporting is also essential, as is the need to listen to what is being said: we had not fully appreciated how important the ‘bunched deadlines’, ‘exams’, and ‘feedback’ issues were to our students. Focus groups cannot succeed unless those convening them respond proactively to feedback.

Follow up

There will be two further DEL student focus group meetings, one in the Autumn Term 2017 (to provide feedback on our plans and to encourage reflection in the area of feedback) and one in the Spring Term 2018 (for a final consultation prior to implementation of new assessment strategies). It is worth adding that, though we have not yet advertised the Autumn Term focus group meeting, 6 students have already emailed me requesting a place on it. There is clearly an appetite to become involved in our assessment review and student contribution to this process has already revealed its value in terms of teaching and learning development.