Enhancing and developing the use of OneNote Class Notebook beyond the Covid-19 pandemic

 

By: Rita Balestrini, Department of Languages and Cultures, r.balestrini@reading.ac.uk

Overview

The project built on T&L innovation embraced in the Department of Languages and Cultures (DLC), where Microsoft’s OneNote Class Notebook (CN) was trialled during the Covid-19 pandemic to overcome the constraints of teaching languages remotely. The outcomes gave staff users of CN, and colleagues in Technology Enhanced Learning (TEL) and Digital Technology Services (DTS), knowledge of DLC students’ experience of CN and an understanding of the type of support CN users need, and informed the development of technical and pedagogical support and guidance for CN users.

Objectives

The project aimed to enhance student learning by improving the use of CN. Its objectives were to facilitate the sharing of knowledge and expertise, gain insights into students’ experience of CN, and inform TEL and DTS decision-making on the tool.

Context

CN is a digital T&L tool to store materials (e.g. text, images, handwritten notes, links, recorded voice, videos), where students and teachers can work interactively in and outside the classroom. It is organised into four parts:

  • ‘Content Library’ – where only the teacher can add, edit, and delete materials.
  • ‘Collaboration Space’ – a place to collaborate in groups open to everyone in the class, where multiple users can work on a page simultaneously or asynchronously.
  • ‘Teacher Only’ – a private space invisible to students.
  • ‘Student Notebook’ – a private area that only a student and their teacher can see and use, where they can interact directly on a page.

CN continues to be used in some language modules as it proved to be effective beyond a remote teaching environment and offered features that supported accessibility and inclusivity in language learning.

Screenshot of a Class Notebook © Rita Balestrini

Implementation

  • In 2022–2023, I held three sessions with DLC staff users to share practice and ideas on using CN, and record information on what support would enhance teaching with CN. I also held in-person, small group meetings with DLC CN student users from all year groups to gain insights into their experience of CN. The feedback gathered informed the development of a branched MS Forms survey, which was completed by 28 (of 50) student CN users.
  • I facilitated a cross-School (DLC, Institute of Education, Law) ‘teaching conversation’ to reflect critically on the pedagogical value of CN.
  • I wrote a project report for DTS and TEL and shared the findings from the needs analysis with them.

Impact

  • The project created a ‘space’ for the sharing of practices, knowledge, experience and expertise, which in turn, enabled the enhancement of the use of CN.
  • It enhanced students’ learning and increased their engagement with CN learning activities – as evidenced by the students’ survey.
  • As part of ‘internal monitoring and review’ practice, the outcomes of the ‘teaching conversation’ informed the School of Literature and Languages (SLL) T&L enhancement process.
  • The project should inform the integration of CN with other applications (e.g. Teams), and the provision of technical and pedagogical support for CN users.

Reflections

CN offers a paperless learning environment and facilitates the organisation of T&L materials in a clear and potentially visually intuitive hierarchical structure. Students evaluated CN positively as a useful ‘digital binder’ and ‘learning tool’ (Average Rating [AR] 4.15 and 4.00 respectively, on a scale of 1 to 5). Most of them felt that materials and resources were easy to access (AR 3.89) and that it was easy to take notes within CN (AR 3.74).

CN users generally agreed that navigation in CN is quite ‘fluid’ compared to Blackboard. However, I think that for this fluidity to be fully meaningful pedagogically, CN requires a thoroughly thought-out structure, with reasoned and transparent ‘labelling’ throughout the learning environment.

There can be issues in meeting current assessment policies when using CN for summative assessment. However, CN greatly facilitates the provision of feedback with a digital pen or by audio-recording on a page. Staff value the possibility of monitoring individual and group activities and providing private, individualised feedback in different formats; students appreciate highly receiving feedback directly in their personal notebook, which stands out as a noteworthy result of the survey (AR 4.26), especially considering that feedback in general ‘is often framed as the dimension of students’ experience with which they are least satisfied’ (Winstone & Carless, 2020, p. 5).

The ‘Collaboration Space’ can be used for class activities, collaborative projects, sharing resources, and as a channel for students’ voice – additional uses of this area depend on the subject taught. CN allows students to take ownership of a shared area and use it for independently chosen purposes, which helps create a sense of ‘community’ and a feeling of ‘online connectedness’ (Hehir et al., 2021).

Regarding technical issues, 68% of respondents did not report any. The others mentioned a variety of problems (e.g. syncing issues, ‘handwriting’ and ‘highlighting’ not anchored to text). Many reported difficulties were linked to using different CN versions, devices, or operating systems, which suggests that improvements could come with advice from technical support specialists.

Follow up

In future, students’ experience of using CN could greatly benefit from:

  • staff sharing and discussing practices across different subjects to facilitate pedagogical enhancements (e.g., communities of practice, special interest groups, TEL forums);
  • making access to CN easier and facilitating its integration with Teams and Blackboard;
  • the availability of expert support from DTS and TEL (e.g. technical assistance, TEL resources and sessions for users).

References

  • Hehir, H., Zeller, M., Luckhurst, J., & Chandler, T. (2021). Developing student connectedness under remote learning using digital resources: A systematic review. Education and Information Technologies, 26, 6531-6548. https://doi.org/10.1007/s10639-021-10577-1
  • Winstone, N., & Carless, D. (2020). Designing effective feedback processes in higher education. Routledge.
Reframing success in a partnership project

Associate Professor Amanda Millmore, School of Law

 

Objectives

  • Curriculum development – reviewing & designing materials and the Blackboard framework for a new elective first year module.
  • Peer mentoring – student partners in Part 2 offering support to students on the module, embedded within the module by linking student partners directly with each seminar group and including them in online drop-ins and in-person teaching.

Context

During the Covid-19 pandemic, our students had struggled with their sense of belonging, not feeling part of the School of Law community due to lockdowns, online teaching and restrictions on gathering socially. We were creating a new elective, Part 1 law module called “Law and Society”, and we wanted to work with students to develop the module. We were also conscious that we needed to improve support for our new first-year students to ease their transition into university and their studies by enhancing their sense of belonging. We came up with the idea of supporting the new students by building bridges with the cohort in the year above.

Implementation

Curriculum Design – the student partners worked together with staff to review the materials we had prepared and to give their thoughts on what would be helpful and work for the new Part 1 students.

Peer Mentoring – we embedded student partners as mentors with individual seminar groups. We introduced them online with a dedicated “Mentor” section on Blackboard and hosted a “Q&A” Padlet board for students to interact anonymously if they wished. The module was designed with the mentors embedded into it. Student partners were each paired with one of the teaching academics on the module to provide support. Mentors were timetabled to join optional online drop-in sessions (headed “Meet the Mentors”) and compulsory seminars to offer support with groupwork and formative activities. Academic staff highlighted the benefits of peer support and promoted the mentors and how they could help, while mentors encouraged formal and informal contact with the students in their designated classes.

When student mentees did not attend the optional drop-in (we had more student partners attending than we did students enrolled on the module) we pivoted to the student partners sharing their advice for new students, which we recorded in a document that we shared on Blackboard.

Impact

Curriculum Design – this aspect of the project was very successful, with student partners feeding into the design of the Blackboard module and reviewing the module materials to ensure that they were engaging and pitched at the appropriate level; on student recommendation, we ensured the provision of clickable Talis reading lists.

Peer Mentoring – this aspect fell flat, as the Part 1 students did not want to be mentored. They did not attend sessions where the mentors were offering support and declined offers of help (even when they volunteered to join a WhatsApp group), and the student partners felt that we were flogging a dead horse trying to mentor first-year students who did not want to be mentored. Student partners then pivoted to carry out some research into the barriers to engagement with the project. This too was beset with difficulties, as the Part 1 students did not respond to questionnaires, offers of coffee and cake, or focus groups; the few who did participate explained that they simply did not feel the need for that kind of peer support.

Reflection

Whilst the mentoring aspect of the project did not land successfully with the Part 1 students, this was not due to problems with the partnership or even the design of the project; the Part 1 cohort simply did not want the support that we were offering. This may be peculiar to this particular cohort, who had been significantly affected by Covid at school, but it was not for want of trying.

Whilst not one of our explicit aims, the notable success of our partnership is its value to the student partners, who worked as module designers, mentors and researchers. These students have had the opportunity to disseminate their experiences at conferences and in writing, can see real benefits from their partnership experiences, and have developed tangible employability attributes, not least a high degree of resilience.


Amanda Millmore and student partners before presenting at the Change Agents’ Network conference 2022

Follow-up

Student partners co-presented this project at the CAN (Change Agents’ Network) conference at UCL in summer 2022 and we have now co-authored a journal article sharing our experiences.

We have continued with the curriculum developments in the module, which goes from strength to strength. The mentoring aspect of the project has not continued; instead, we make sure to signpost our students to their STaR mentors and PAL leaders for peer support.

Partnership working in the School of Law continues to be business as usual, and the hiccups on this project have not deterred us from trying new things with our student partners, ensuring that we see the benefits of partnership as part of the process and the positives for the partners.

Links

We contributed to a blog after the CAN conference: CAN Case Study: A Pivoting Partnership – Student Mentors Trying to Engage: a Tale of Trial & Error | CAN 2022 (ucl.ac.uk)

Forthcoming article (co-authored by staff & student partners) in the Journal of Educational Innovation, Partnership & Change in 2023 – currently in copy editing phase – will share a link once it’s available.

If you’d like to know more about staff-student partnership in the School of Law, you can reach me at a.millmore@reading.ac.uk


 

Using student feedback to make university-directed learning on placement more engaging

Anjali Mehta Chandar: a.m.chandar@reading.ac.uk

Charlie Waller Institute, School of Psychology and Clinical Language Sciences

 

Overview

Our vocational postgraduate courses in Cognitive Behavioural Therapy include University Directed Learning (UDL) days that are completed within the placement setting (e.g. the student’s NHS trust). A qualitative student feedback survey allowed us to collaboratively adapt this format, with favourable outcomes in how interesting, enjoyable and useful the students found the days.

Objectives

Our objectives were as follows:

  • To ascertain how interesting, enjoyable and useful the UDL days were, as perceived by the students, based on pedagogical findings that students engage best and are most satisfied if these characteristics are met (e.g. Ramsden, 2003).

  • To make improvements to the UDL days based on qualitative student feedback.

  • To ascertain whether our improvements had made the UDL days more interesting, enjoyable and useful, as perceived by the next cohort of students.

Context

The Educational Mental Health Practitioner (EMHP) and Children’s Wellbeing Practitioner (CWP) programmes are one-year vocational postgraduate courses. The students are employed by an NHS trust, local authority or charity, and study at UoR to become qualified mental health practitioners.

UDL days make up a small proportion of the teaching days. They are self-guided teaching days, usually containing elements of e-learning, designed to complement and consolidate face-to-face teaching (live or remote). A combination of learning methods, including e-learning, has been shown to be effective in increasing clinical skills (e.g. Sheikhaboumasoudi et al., 2018).

UDL days had been poorly received by our two 2019/2020 cohorts, according to feedback in the student rep meetings and Mentimeter feedback after each UDL. Comments included: ‘there was too much [content] for one day’, ‘I felt pressured to fill [the form] out rather than focussing on the readings themselves’ and ‘[the reflective form] was too long and too detailed’. Whilst this gave us some ideas on changes to make, I was aware of the low completion rates of the Mentimeter feedback. Therefore, to hear from more voices, we decided to create a specific feedback survey about the UDLs to help us make amendments in a collaborative way.

Implementation

We started by creating a survey for the current students to ascertain their views on how interesting, enjoyable and useful the UDL days were. We also had qualitative questions regarding what they liked and disliked and ideas for specific improvements.

I then led a meeting with my course team to explore the key findings. We agreed to make several changes based on the specific feedback, such as:

– a variety of activities (not purely e-learning, but roleplays, videos, self-practice self-reflection tasks, group seminars run by lecturers, etc., to provide a more engaging day)
– fewer activities (we aimed for one main activity for every 1-1.5 hours to manage workload demands)
– an option to complete the programme’s reflective form (designed to be simpler, by asking them to provide their own notes on each task) or to provide their notes in a format of their choice (e.g. mindmaps, vlogs, etc.) to increase accessibility
– an option to share these reflections on a discussion board for other students and the lecturer to comment on.

We were unable to implement these changes for the current cohort, as they had finished all their UDL days in the timetable, so we made the changes for the following cohorts in 2020/2021.

We then sought the new cohorts’ feedback via a new survey to ascertain their views on how interesting, enjoyable and useful the UDLs were, with additional questions relating to specific feedback on the new elements.

Impact

The survey results for the newer cohorts, after the changes were made to the UDL format, were much more positive than those for the original cohort.

There was a significant increase in how interesting, enjoyable and useful the students found the days.

The trainees also largely agreed that the UDLs had an appropriate workload, e.g. one task per 1-1.5 hours.

They also largely agreed that UDLs included interactive and varied tasks. This finding contrasts with some of the aforementioned literature on the importance of e-learning, and it is worth remembering that too much e-learning can be less engaging for trainees.

The students also praised the simple reflective form as a helpful tool, and many appreciated the option to submit notes in their own preferred way.

Although we neglected to explore the role of lecturer feedback in the new UDL survey, research shows that it makes for a more engaging e-learning session (Dixson, 2010), and this may explain why the UDLs were now more favourably received.

Moreover, the process of collecting data from the students via a feedback form seemed effective, in that we used feedback to adapt the specific teaching method, thus improving student satisfaction. Pedagogical research shows the importance of using qualitative questions (instead of, or as well as, quantitative methods) to elicit student feedback (Steyn et al., 2019).

Reflection

Overall, this redesign was successful, which may be down to the fact that we used the student voice to make meaningful changes. This is in line with Flodén’s (2017) research showing that student feedback can help to improve courses.

Furthermore, the changes we have made are in line with effective practice amongst other courses and universities, e.g. appropriate workload (Ginns et al., 2007), student choice of discussion format (Lin & Overbaugh, 2007), accessibility of resources (Mahmood et al., 2012) and lecturer interaction (Dixson, 2010).

There is a possible limitation in this case study, in that our more recent cohorts are generally happier on the course, and therefore may be more positive about the UDL. In future projects, it would be useful if we could notice themes within module evaluation/student rep meetings earlier, so that we could elicit specific survey feedback earlier in the course and make amendments sooner, allowing feedback from the same cohort.

In future variations of the survey, I would also wish to explicitly ask how trainees find sharing reflections on the Blackboard discussion groups, as this is one change we had not elicited feedback on.

Follow up

We have continued to utilise these changes in the UDL format with future cohorts, e.g. reduced workload, variety of activities, simplified forms, choice of discussion format and lecturer interaction. Since the original cohort, we have received no concerns about these days in the student rep meetings. The Mentimeter feedback at the end of each UDL is generally positive, with one person recently commenting: ‘this was a very engaging day’.

References

Dixson, M. D. (2010). Creating effective student engagement in online courses: What do students find engaging? Journal of the Scholarship of Teaching and Learning, 1-13.

Flodén, J. (2017). The impact of student feedback on teaching in higher education. Assessment & Evaluation in Higher Education, 42(7), 1054-1068.

Ginns, P., Prosser, M., & Barrie, S. (2007). Students’ perceptions of teaching quality in higher education: The perspective of currently enrolled students. Studies in Higher Education, 32(5), 603-615.

Lin, S. Y., & Overbaugh, R. C. (2007). The effect of student choice of online discussion format on tiered achievement and student satisfaction. Journal of Research on Technology in Education, 39(4), 399-415.

Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A comparative study of student satisfaction level in distance learning and live classroom at higher education level. Turkish Online Journal of Distance Education, 13(1), 128-136.

Ramsden, P. (2003). Learning to teach in higher education. Routledge.

Sheikhaboumasoudi, R., Bagheri, M., Hosseini, S. A., Ashouri, E., & Elahi, N. (2018). Improving nursing students’ learning outcomes in fundamentals of nursing course through combination of traditional and e-learning methods. Iranian Journal of Nursing and Midwifery Research, 23(3), 217.

Steyn, C., Davies, C., & Sambo, A. (2019). Eliciting student feedback for course development: The application of a qualitative course evaluation tool among business research students. Assessment & Evaluation in Higher Education, 44(1), 11-24.

Links

CWI website: https://sites.reading.ac.uk/charlie-waller-institute/

The One Where a Timetable Merger Gives Rise to a Curriculum Implementation Review

Emma-Jayne Conway, James Kachellek and Tamara Wiehe

t.wiehe@reading.ac.uk

Overview

Staff and students in CWI collaborated on a project initially designed to merge the timetables of two sister programmes to aid cross-programme working (objective 1), which gave rise to the perfect opportunity to review the way our PWP curriculum is implemented following the pandemic (objective 2). In this blog, we reflect on achieving both objectives within our original timeframe!

Objectives

1. To create a single timetable to aid cross-programme working for teaching and administrative staff.

2. To review curriculum implementation, including structure and modality, on a modular and programme level with all key stakeholders.

Context

In response to a departmental restructure, we required more efficient ways of working across programmes starting with a uniform timetable. Early on, the project evolved to also review the structure and modality of the curriculum. Our two sister PWP training programmes (one undergraduate and one postgraduate) are virtually identical with a few exceptions but historically had been managed separately.

Over the course of 2021, we planned, designed, and implemented a timetable merger for our September cohorts. This impacted on three modules (four for undergraduates) that form the PWP training year for the MSci Applied Psychology (Clinical) students and the Postgraduate/graduate Certificate in Evidence-Based Psychological Treatments (IAPT Pathway).

Taking both Higher Education and Mental Health Care processes into consideration was no easy feat; these included processes specific to the University of Reading (e.g., term dates), our national PWP curriculum specifying the content and learning outcomes for our 26 teaching days and 19 study days, and British Psychological Society (BPS) accreditation requirements. Modality was a particularly important topic throughout this project, taking key learnings from remote delivery during the pandemic as well as awaiting guidance from our professional accrediting body.

Overall, it served as an excellent opportunity to work collaboratively with staff and students to review the implementation of PWP training at the University of Reading.

Implementation

  1. Early 2021: The PWP team met on several occasions to discuss the possibility of merging the two timetables, including transitioning to a “blended” format of online and face-to-face teaching post-Covid. We set out a timeline for planning, designing, and implementing the project.
  2. Advice was sought from the Director of Training in CWI and colleagues in Academic Development, CQSD based on their experience of timetable mergers and a green light was given based on our draft plans!
  3. Several options were considered before the final format was arrived at: Face-to-face teaching is weighted towards the first module/term with progressive increase to the online taught content as the course progresses. (Rationale supplied elsewhere in this blog).
  4. The educator team were able to draw on feedback from online teaching to gauge the attitude of the student body to online learning, as well as expectations and concerns related to a return to the classroom (see Impact, below). The student voice was important in terms of utilising partnership to create meaningful change to curriculum implementation. However, engaging professional practice students via the course reps was a challenge due to time constraints; therefore, we engaged graduates instead. This is something we would consider earlier on in future projects.
  5. The educator team unanimously agreed that the externally taught content of the VEC module could be effectively taught with mixed cohorts from the Core-PWP and MSci cohorts using an online approach.
  6. Information on the changes was disseminated to Programme Administrators to enable efficient implementation. External Educators were made aware of the retention of online lecture sessions, and of the mixed-cohort approach, by the VEC module convenor.
  7. Timetables were updated by the Programme Director, in collaboration with Module Convenors; consideration has been given to the potential workload impact of closely aligning multiple cohorts (see below). Timetables have been reviewed by the team ‘side-by-side’ to ensure that workload balance is maintained for educators across all cohorts. We can continue to monitor the impact on workload while adjustments are made to teaching (such as with the Working Document mentioned in the Follow-up section, below).
  8. IAPT Services were made aware of the changes to the timetables.

Impact

As of October 2021, the merged timetables are proving effective, with no major barriers having been detected. Predicted barriers included those to effective teaching of (previously face-to-face) content, student/staff dissatisfaction with a blended approach, and significant administrative/technical difficulties.

Face-to-face teaching resumed in September 2021, and the return to the classroom has been successful. Educators report being able to switch between live sessions and face-to-face teaching days without significant barriers.

The educator team plan to continue to gather feedback on the student experience of the blended and merged approach. We will be able to assess feedback when the first cohorts fully complete in June 2022.

Feedback will be sought from module convenors, educators, and programme administrators using “menti” feedback forms, bi-weekly team meetings and informal qualitative discussion, to gauge the impact of the changes on workload. Student feedback will also be monitored through end-of-module feedback collated by the School.

Reflection

  • The challenge of engaging professional practice students was overcome by utilising graduates. We will consider setting up a graduate/patient advisory group for future projects.
  • Using feedback from an MSci graduate led to timetable changes to ensure readability and clarity for students. This included points such as colour coding F2F vs online teaching days, explaining acronyms, etc.
  • Involving all members of the team (especially Module Convenors) felt like a much more meaningful and collaborative process than Programme Director decisions alone. It gave Module Convenors autonomy over their modules as well as aligning learning outcomes across the 3 modules of the programme which is particularly important for clinical training. Other courses may wish to replicate this approach to build team cohesion and allow all colleagues to make meaningful contributions to programme changes and delivery.

Follow up

  • A working document has been created for the educator team to comment on the teaching they have just delivered, i.e., was there enough time to deliver all content? This has allowed changes to be made within a matter of weeks, as the same day is delivered across the programmes. As a result, we can fine-tune the timetable and delivery of the programme quickly and efficiently to improve the student experience.
  • We will review module by module and at the end of each cohort to continue making any necessary adjustments. Module and programme evaluations, student-staff rep meetings and any feedback from individual teaching days will also help to inform this.

 

Driving programme development in the IOE: student focus groups and paper writing

Jo Anna Reed Johnson – Institute of Education

j.a.reedjohnson@reading.ac.uk

Overview

This article outlines the thinking behind driving programme development through student focus groups across three IOE programmes. The goal of writing a paper and presenting at a conference helped me to frame this project with a team of academics, focusing on changes made during Covid-19 (2020-2021). This article shares reflections on the setting up and running of the focus groups, the delivery of the conference presentation and the final paper writing. Finally, it discusses what we have learnt from this and what we will continue to do.

Objectives

  • Share four academic perspectives on the redesigning of three modules (SKE, STEAM, PGCE Sec Science) that all have practical elements (laboratory or school), due to Covid-19, by sharing what we did and exploring the student perspectives
  • Show how we designed opportunities for discussion and collaboration when conducting practical work or school related work online
  • Consider the use of student focus groups for programme co-development
  • Reflect on the collaborative nature of paper writing and co-programme reflections

Context

At the IOE there is a range of teacher education programmes with a practical focus. The four colleagues engaged in this article were involved with Skills in Schools (ED2TS1 – March to July 2020), SKE (IESKEP and PFTZSKEMATHSA – March to Aug 2020) and PGCE Secondary Science (GFSTSCIENCE – September 2020 to June 2021). These programmes all require students to work in schools and engage in a science laboratory (if science focused). As Covid-19 hit in March 2020, we had to think quickly and imaginatively, transforming our provision to be online where required. Having worked across all three programmes, I felt it was pedagogically appropriate to engage our students in the same ways we had throughout their learning during the pandemic, where they had worked in online communities of practice to reflect. Thus, we decided to set up a series of focus groups with students reflecting on the impact of the changes and to provide insights for future programme innovations. This culminated in a conference presentation and paper.

Implementation

The focus was to drive programme development through the reflections and shared experiences of academics and students. I set up a project timeline and an MS Team to manage and drive the deliverables, with the end goal of engaging students as co-programme developers, culminating in a conference presentation and paper. It required framing the project, seeking ethical approval and funding, setting up focus groups to collect data, and then reflecting and writing up.

Framing the project allowed me to maintain the focus on the redesigning of three modules that all had practical elements (laboratory or school), due to Covid-19, and then to explore, through focus groups, how that had impacted on students. It was the conference and paper deadlines that drove this activity and timeline. At first, colleagues wondered why we were writing a paper for a submission related to the School of Architecture (Manchester and Manchester Metropolitan University), but in fact it was because the remit was about ‘place’: a paper related to ‘online education: teaching in a time of change’.

Seeking ethical approval and funding required knowing where to go and what to do. Ethical approval required submission of an ethical approval form (including consent form, interview schedule and focus group details) to the IOE ethics committee. Funding was then applied for through the University Travel Grants Scheme – Tasha Easton – e.saxon@reading.ac.uk

Data collection was carried out using MS Forms for the initial feedback request.  Consent was also required, so where this could not be obtained in person, consent approval was attached to the top of the MS Form.  Once participants had consented, and those who were willing had indicated they would take part in the focus groups, I could set up a series of focus groups across the three programmes, to take place on MS Teams.  We decided to split the four sets of interviews into subject-specific groups so that the conversations and reflections could be driven by the students.  One student was nominated as the chair, and they had a template of questions to guide their discussions.

Paper Writing was a challenge as we needed to fit this around our Teaching Focused roles.  I created a writing template after attending an IOE Research and Scholarship Writing Workshop with Professor Alan Floyd.  I scheduled meetings to review, discuss and allocate sections of writing.

The whole process began in December 2020 and continued through to 30 May 2021, with the conference on 21–23 April 2021 and paper publication in July 2021.


Impact

There were several elements of impact:

  • Working collaboratively with colleagues to reflect on programme development
  • Engaging students as co-programme developers
  • Attending a conference (where funding allowed)
  • Conference paper presentation
  • Conference paper publication

Reflection

In terms of setting up focus groups and driving data collection, we learnt that we needed to be organised, and the timeline/plan really helped to keep that focus.  There were times when we were too busy, but we had to create time as we had deliverables to meet.  If we had not had those deliverables of a conference presentation and paper, we might have let this slip and done it ‘next year’.

Writing the paper was a challenge in that we had not done this together before, and some colleagues had not written an academic paper in a very long time, or even an educational one.  So, creating that writing template and allocating tasks worked.

Gaining conference funding can always be a challenge, but reaching out and asking was the first step, then finding out what could be offered at University/School level.  Next time, we would all like to attend the conference.  Being an online conference made it more difficult to engage, and next time we would plan to all secure funding and attend a face-to-face conference so that we too can benefit from being part of the community of practice.

What we will continue to do….

  • Develop students as programme co-developers through focus groups, engaging them in the paper writing.
  • Use focus groups to help us (academics) reflect on our own practice and discuss developments across programmes.
  • Drive programme development through the sharing of practices, building communities of practice with timelines and deliverables.

What else will we do…

  • Engage students in the paper writing and conference process.
  • Seek funding to attend a F2F conference with colleagues, to allow us time and space to continue to reflect on practice.

Links

Research and Travel Grants Committee: https://www.reading.ac.uk/closed/research/committees/res-researchtravelgrantsubcommittee.aspx

AMPS Conference 21-23 April 2021 – https://architecturemps.com/online-ed-conference/

Online Delivery of Practical Learning Elements for SKE

Jo Anna Reed Johnson, Gaynor Bradley, Chris Turner

j.a.reedjohnson@reading.ac.uk

Overview

This article outlines the re-thinking of how to deliver the science practical elements of the subject knowledge enhancement programme (SKE) following the impact of Covid-19 in March 2020 and the move online.  It focuses on what we learnt from delivering the practical elements of the programme online, as the programme required students to engage in laboratory activities to develop their subject knowledge skills in Physics, Chemistry and Biology, related to the school national curriculum, over two weeks in June and July 2021.  Whilst there are some elements of the programme we will continue to deliver online post Covid-19, there are aspects of practical work where our students would still benefit from hands-on experience in the laboratory, with the online resources enhancing that experience.

Objectives

  • Redesign IESKEP-19-0SUP: Subject Knowledge Enhancement Programmes so it was COVID-safe and fulfilled the programme objectives in terms of Practical work
  • Redesign how students could access the school science practical work with no access to laboratories. This relates to Required Practical work for GCSEs and A Levels
  • Ensure opportunities for discussion and collaboration when conducting practical work
  • Provide students with access to resources (posted and shopping list)
  • Review student perspectives related to the response to online provision

Context

In June 2020 there was no access to the science laboratories at the Institute of Education (L10) due to the Covid pandemic. The Subject Knowledge Enhancement Programme (SKE) is a pre-PGCE requirement for applicants who want to train to be a teacher but may be at risk of not meeting the right level of subject knowledge (Teacher Standard 3) by the end of their PGCE year (one-year postgraduate teacher training programme).  We had 21 SKE Science (Physics, Chemistry, Biology) students on the programme, 3 academic staff, one senior technician and two practical weeks to run.  We had to think quickly and imaginatively.  With a plethora of resources for school practical science available online, we set about reviewing these and deciding what we might use to provide our students with the experience they needed.  In addition, we streamlined the programme content for the practical weeks, as working online requires more time.

Implementation

In May 2020, the senior science technician was allowed access to the labs.  With a lot of work, she prepared a small resource pack with some basic equipment that was posted to each student.  This was supplemented with a shopping list so that students could prepare in advance of the practical weeks.  For the practical week programme, we focused on making use of free videos available on YouTube and the Web (https://www.youtube.com/watch?v=jBVxo5T-ZQM and https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12).

Having been part of an online lab research project at the University of Leicester, I introduced this to students for simulations along with PhET (https://www.golabz.eu/ and https://phet.colorado.edu/).

We also wanted students to still feel some sense of ‘doing practical work’ and set up the home labs for those topics we deemed suitable e.g. heart dissection, quadrats, making indicators.


PowerPoints were narrated or streamed.  We set up regular meetings with colleagues in the morning before each practical day started.  We did live introductions with the students to outline the work to be covered that day.  In addition, we organised drop-in sessions, such as mid-morning breaks and end-of-day reviews, with the students for discussion.  Throughout, students worked in groups where they would meet, discuss questions and share insights.  This took place through MS Teams meetings/channels, where we were able to build communities of practice.

Impact

End-of-programme feedback was received via our usual emailed evaluation sheet at the end of the practical weeks and the end of the programme (5/21 responses).  The overall feedback was that students had enjoyed the collaboration and thought the programme was well structured:

‘I particularly enjoyed the collaborative engagement with both my colleagues and our tutors. Given that these were unusual circumstances, it was important to maintain a strong team spirit as I felt that this gave us all mechanisms to cope with those times where things were daunting, confusing etc but also it gave us all moments to share successes and achievements, all of which helped progression through the course. I felt that we had the right blend of help and support from our tutors, with good input balancing space for us to collaborate effectively.’

Student Feedback initial evaluation

‘I enjoyed “meeting” my SKE buddies and getting to know my new colleagues. I enjoyed A Level Practical Week and found some of the online tools for experimentation and demonstrating experiments very helpful’

Student Feedback initial evaluation

To supplement this feedback, as part of a small-scale scholarly activity across three programmes, we also sent out an MS Form for students to complete, to allow us to gain some deeper insights into the transition to online learning (22/100 responses).  Students highlighted, for example, the excitement of doing practical work, and the experience of using online tools that they could then use in their own teaching practice:

‘…the excitement of receiving the pack of goodies through the post was real and I enjoyed that element of the program and it’s been genuinely useful. I’ve used some of those experiments that we did in the classroom and as PERSON B said virtually as well.’

Student Feedback MS Form

‘…some of the online simulators that we used in our SKE we’ve used. I certainly have used while we’ve been doing online learning last half term, like the PHET simulators and things like that…’

Student Feedback MS Form

The students who consented to take part engaged in four discussion groups (5 participants per group = 20 participants).  These took place on MS Teams and once again highlighted the benefits of online group engagement, as well as showing that we were still able to meet the programme objectives:

‘I just wanted to say, really. It was it was a credit to the team that delivered the SKE that it got my subject knowledge to a level where it needed to be, so I know that the people had to react quickly to deliver the program in a different way…’

Student Feedback Focus Group

There was also some feedback that helped us to review and feed back into our programme development, such as surprise at how much independent learning there is on the programme, and at the amount of resources and other materials (e.g. exam questions).

Reflections

We adopted an integrated learning model.

We learnt that you do not have to reinvent the wheel.  With the plethora of tools online, we just needed to be selective, asking ‘did the tool achieve the purpose or learning outcome?’.  In terms of planning and running online activities we engaged with Gilly Salmon’s (2004) five-stage model of e-learning.  This provides structure, and we would apply it to our general planning in the use of Blackboard or other TEL tools in the future.  We will continue to use the tools we used: these are useful T&L experiences for our trainees as schools move more and more towards engagement with technology.

However, the students still felt that nothing can replace actual practical work in the labs:

‘I liked the online approach to SKE but feel that lab work face-to-face should still be part of the course if possible. There are two reasons for this: skills acquisition/familiarity with experiments and also networking and discussion with other SKE students.’

Student Feedback MS Form

Where possible we will do practical work in the labs but supplement these with the online resources, videos and simulation applications.  We will make sure that the course structure prioritises the practical work but also incorporates aspects of online learning.

We will continue to provide collaborative opportunities and engage students online for group work and tutorials in future years.  We also found the ways in which we collaborated through communities of practice, on MS Teams Channels, was very effective.  We set up groups, who continued to work in similar ways throughout the course, who were able to share ideas by posting evidence, then engage in a discussion.  Again, this is something we will continue to do so that when our students are dispersed across the school partnership, in different locations, they can still be in touch and work on things collaboratively.

Links and References

Online Lab case Studies

https://www.golabz.eu/

https://phet.colorado.edu/

Practical week Free Videos

https://www.youtube.com/watch?v=jBVxo5T-ZQM

https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a re-design that was undertaken for the Part 1 autumn/spring chemistry module, CH1PRA, which services approximately 45 students per year. All students complete practical work over 20 weeks of the year. There are four blocks of five weeks of practical work in rotation (introductory, inorganic, organic and physical) and students spend one afternoon (4 hours) in the laboratory per week. The re-design was partly due to COVID, which forced us to look critically at the experiments the students completed, to ensure that the practical skills students developed during the pandemic were relevant for Part 2 and beyond, and to ensure that the assessments could also stand alone as exercises if COVID prevented the completion of practical work. COVID in fact provided us with an opportunity to re-invigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

  • Redesign CH1PRA so it was COVID-safe and fulfilled strict accreditation criteria
  • Redesign the experiments so that as many students as possible could complete practical work, converting some experiments to be suitable for completion on the open bench to maximise laboratory capacity
  • Redesign assessments so that students who missed sessions due to COVID could still collect credit
  • Minimise assessment load on academic staff and students
  • Move to a more skills-based assessment paradigm, away from the traditional laboratory report

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of a practical class due to restrictions on the number of students allowed within the laboratory: 12 students in the fumehoods and 12 students on the open bench (rather than up to 74 students, all using fumehoods, previously). Prior to the redesign, each student completed four or five assessments per 5-week block, all of which related to a laboratory-based experiment. In addition, the majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as we intended, so execution of the experiment was passive. The technical reports placed a significant marking burden upon the academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments upon completion of a practical experiment was also deemed high-risk with the advent of COVID, therefore we had to re-think our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing processing of 24 students per week (12 on the bench and 12 in the fumehood), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class, therefore fulfilling our accreditation requirements. The majority of assessments were linked to the ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that if a student had to self-isolate they could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each 5-week block had three assessment points and, where possible, one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require upon employment or during a placement. To encourage active learning, reflection was embedded into the assessment regime; it was hoped that, by critically appraising their performance in the laboratory, students would better remember the skills and techniques they had learnt, rather than falling into the “see, do, forget” mentality that is prevalent within practical classes.

Examples of assessments include: undertaking data analysis, focussing on clear presentation of data; critical self-reflection of the skills developed during a practical class i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critically engaging with a published scientific procedure; and giving a three-minute presentation about a practical scientific technique commonly-encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well-run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments that we had to make to the delivery of laboratory based practicals did not have a negative impact upon the students’ experience and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous: there was a slight disconnect between what we expected students to achieve from the online practicals and what students were achieving. A number of the students polled disliked the online practical work, with the main reason being that the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they are aligned more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.


Considering wellbeing within the placement module assessment

Allán Laville (Dean for D&I and Lecturer in Clinical Psychology) and Libby Adams (Research Assistant), SPCLS

Overview

This project aimed to design a new alternative assessment to form part of the MSci Applied Psychology course, which emphasises the practical side of training as a Psychological Wellbeing Practitioner (PWP).  This included utilising problem-solving skills and wellbeing strategies.

Objectives

  • This project was funded by SPCLS Teaching & Learning Enhancement Fund and aimed to design an alternative assessment to be used as a part of the MSci Applied Psychology course to support student wellbeing.
  • The project aimed to incorporate an assignment into the curriculum which provides students with transferable problem-solving and wellbeing management strategies which can be used in future mental health support/clinical roles.

Context

The project was undertaken because, within IAPT (Improving Access to Psychological Therapies), Psychological Wellbeing Practitioners (PWPs) are required to work in a fast-paced environment, seeing multiple patients back-to-back throughout the day. Students on the MSci Applied Psychology course are required in their third year to undertake a work placement one day a week in the first term, increasing to two days a week in the second term. Students are also required to undertake one full day of training per week. The aim of the project was to embed within the curriculum an assignment which focusses on managing wellbeing.

Implementation

Allán Laville (Dean for Diversity and Inclusion) brought to light the concept of incorporating wellbeing within the curriculum and contacted Libby Adams (Part 4 MSci Student) to see whether she would take part in the development of the new assessment. Libby Adams was included as she had previously trained as a Psychological Wellbeing Practitioner and had experienced first-hand the challenges of managing the demands of the PWP role as a trainee and, in turn, of managing her own wellbeing.

Libby Adams’ experience

The project was developed with my own challenges in mind.  To build upon this, we met with current and past MSci students to gain insight into the challenges they faced.  We were then able to condense this information and incorporate it within our concept of a wellbeing blog, and considered how we could problem-solve ways around the areas that could not be included in the blog.  At the second stage we met with clinical staff and educators to share our idea and gain feedback on the feasibility of implementation within IAPT services.  The final project design was then formed with the above feedback in mind.

Impact

Views from current MSci students on the benefits of the project:

“I think maintaining our own wellbeing is such a critical part of caring professions, and I think that making it a clear and mandatory part of the course you’re not only helping students look after themselves for this year, but also for their future careers as well.”

Relating to the outlined objectives, the project successfully designed a prototype assessment which considers the importance of maintaining wellbeing and utilising problem-solving skills. The project will have a positive impact on individuals not only in their placement year but also, as the skills are transferable, if they choose to go into a clinical career after university.

Traffic Light Mood Tracker

Students are required to complete the traffic light system to indicate how they are currently managing their wellbeing. They complete this three times for each blog: once before the reflection, once after they have built an action plan based on their reflection, and then in the last term of the academic year when reflecting on their progress.

Reflections

Allán Laville’s reflections:

The project addressed a key consideration within both University training and the psychological workforce, namely, the importance of explicitly considering the wellbeing of our practitioners and therapists. I am delighted with the outcome of the project and it would not have been possible without Libby. Her commitment to psychological therapies and intrinsic motivation to support others always shines through!

Libby Adams’ reflections:

The student-staff partnership is key to improving the overall teaching and learning experience. The partnership allows the member of staff to lead as the expert by knowledge and the student to lead as the expert by experience. Such partnerships allow the development of concepts and improvements in teaching and learning which enhance the student and staff experience.

Follow up

In the future we aim to share our findings with other MSci courses and IAPT services with an aim to increase conversations about practitioner wellbeing and highlight its importance within clinical roles. We hope that strategies used in this project can extend beyond students and be used across IAPT services to maintain wellbeing, improve performance and decrease stress and burnout.

Clinical skills development: using controlled condition assessment to develop behavioural competence aligned to Miller’s pyramid

Kat Hall,  School of Chemistry, Food and Pharmacy, k.a.hall@reading.ac.uk

Overview

The Centre for Inter-Professional Postgraduate Education and Training (CIPPET) provides PGT training for healthcare professionals through a flexible Masters programme built around blended learning modules alongside workplace-based learning and assessment.  This project aimed to evolve the department’s approach to delivering one of our clinical skills workshops, which sits within a larger 60-credit module.  The impact was shown via positive student and staff feedback, as well as interest in developing a standalone module for continuing further learning in advanced clinical skills.

Objectives

The aim of this project was to use controlled condition assessment approaches to develop behavioural competence at the higher levels of Miller’s pyramid of clinical competence (Miller, 1990).

Miller’s Pyramid of Clinical Competence

The objectives included:

  1. engage students in enquiry by promoting competence at higher levels of Miller’s pyramid
  2. develop highly employable graduates by identifying appropriate skills to teach
  3. evolve the workshop design by using innovative methods
  4. recruit expert clinical practitioners to support academic staff

Context

Health Education England is promoting a national strategy to increase the clinical skills training provided to pharmacists, and this project therefore aimed to evolve the department’s approach to delivering this workshop.  The existing module design contained a workshop on clinical skills, but it was loosely designed as a large group exercise which was delivered slightly differently for each cohort.  This prevented students from fully embedding their learning through opportunities to practise skills alongside controlled formative assessment.

Implementation

Equipment purchase: as part of this project, matched funding was received from the School to support the purchase of simulation equipment, which meant a range of clinical skills teaching tools could be utilised in the workshops.  This step was undertaken collaboratively with the physician associate programme to share learning and support meeting objective 2 across the School.

Workshop design: the workshops were redesigned by the module convenor, Sue Slade, to focus on specific aspects of clinical skills that small groups could work on with a facilitator.  The facilitators were supported to embed the clinical skills equipment within the activities, thereby engaging students in active learning.  The equipment gave students the opportunity to simulate the skills and test whether they could demonstrate competence at the ‘Knows How’ and ‘Shows How’ levels of Miller’s pyramid of clinical competence.  Where possible the workshop stations were facilitated by practising clinical practitioners.  This step was focused on meeting objectives 1, 2, 3 and 4.

Workbook design: a workbook was produced that students could use to identify the core clinical skills they required in their scope of practice and thus needed to practise in the workshop, and further in their workplace-based learning.  This scaffolding supported their transition to the ‘Does’ level of Miller’s pyramid of clinical competence.  This step was focused on meeting objectives 1 and 3.

Impact

All four objectives were met and have since been mapped to the principles of the Curriculum Framework to provide evidence of their impact.

Mastery of the discipline / discipline based / contextual: this project has supported the academic team to redesign the workshop around the evolving baseline core knowledge and skills required of students.  Doing this collaboratively between programme teams ensures it is fit for purpose.

Personal effectiveness and self-awareness / diverse and inclusive: the positive staff and student feedback received reflects that the workshop provides a better environment for student learning, enabling them to reflect on their experiences and take their learning back to their workplace more easily.

Learning cycle: the student feedback has shown that they want more of this type of training and so the team have designed a new stand-alone module to facilitate extending the impact of increasingly advanced clinical skills training to a wider student cohort.

Reflections

What went well?  The purchase of the equipment and the redesign of the workshop were relatively simple tasks for an engaged team, and low effort for the potential return in improved experience.  Having one lead for the workshop, whilst another wrote the workbook and purchased the equipment, ensured that staff across the team could contribute as change champions.  Recruitment of an advanced nurse practitioner to support the team more broadly was completed quickly and provided support and guidance across the year.

What did not go as well?  Whilst the purchase of the equipment and workshop redesign was relatively simple, encouraging clinical practitioners to engage with the workshop proved much harder.  We were unable to recruit consistent clinical support which made it harder to fully embed the project aims in a routine approach to teaching the workshop.  We considered using the expertise of the physician associate programme team but, as anticipated, timetabling made it impossible to coordinate the staffing needs.

Reflections: The success of the project lay in having the School engaged in supporting the objectives and the programme team invested in improving the workshop.  Focusing this project on a small part of the module meant it remained achievable to complete one cycle of change to deliver initial positive outcomes whilst planning for the following cycles of change needed to fully embed the objectives into routine practice.

Follow up

In planning the next series of workshops, we plan to draw more widely on the University alumni from the physician associate programme to continue the collaborative approach and attract clinical practitioners more willing to support us who are less constrained by timetables and clinical activities.

Based on student and staff feedback there is clearly a desire for more teaching and learning of this kind, and being able to launch a new standalone module in 2020 is a successful output of this project.

Links and References

Miller, G.E. (1990). The assessment of clinical skills/competence/performance. Acad Med, 65(9):S63-7.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic. Click here for the first entry, which outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students associate with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal) and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but this time we gave students the opportunity to select four elements of feedback and rank them in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A SurveyMonkey survey was circulated to all DEL students with the same series of questions as those used for the focus group, in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The SurveyMonkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% in the focus group preferred to keep to the traditional exam + assessed essay method, the survey found the aversion to exams to be more prominent. 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We found it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second highest scorer (presumably because students identified a connection between the two).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy may not be entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, this would leave modules retaining an exam in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but this could leave modules that retain traditional assessment patterns vulnerable to students deselecting them. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the link below, which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: assessment in DEL, particularly at Part 2, follows a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments best suited to the attainment of them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.