Running Virtual Focus Groups – Investigating the Student Experience of the Academic Tutor System

Amanda Millmore, School of Law

Overview

I wanted to measure the impact of the new Academic Tutor System (ATS) on the students in the School of Law, and capture their experiences, both good and bad, with a view to making improvements. I successfully bid for some small project funding from the ATS Steering Group prior to Covid-19. The obvious difficulty I faced in lockdown was how to engage my students and encourage them to get involved in a virtual project. How can students co-produce resources when they are spread around the world?

Objectives

I planned to run focus groups with current students, with two aims:

  • To find out more about their experiences of the academic tutor system and how we might better meet their needs; and
  • To see if the students could collaboratively develop some resources advising their fellow students how to get the most out of tutoring.

The overall aim was to raise the profile of academic tutoring within the School and the positive benefits it offers to our students, while also troubleshooting any issues.

Implementation

After exams, I emailed all students in the School of Law to ask them to complete an anonymous online survey about their experiences. Around 10% of the cohort responded.

Within that survey I offered students the opportunity to join virtual focus groups. The funding had originally been earmarked for refreshments as an enticement to get involved, so I redeployed it to offer payment via Campus Jobs for the students’ time (a remarkably easy process). I was conscious that many of our students had lost their part-time employment, and it seemed a positive way to help them and encourage involvement. I had 56 volunteers and randomly selected names, ensuring that I had representation from all year groups.

I ran two focus groups virtually using MS Teams, each containing students from different years. This seemed to work well for the 11 students, all of whom were able to join the sessions, and recording the sessions online enabled me to take a more accurate note, which was particularly helpful. I was pleasantly surprised at how well the conversation flowed virtually; with no more than six students in a group we kept all microphones on to allow everyone to speak, and I facilitated with some prompts, encouraging quieter participants to offer their opinions.

The students were very forthcoming with advice and their honest experiences. They were clear that a good tutor relationship can make a real and noticeable difference for students and those who had had good experiences were effusive in their praise. They were keen to help me find ways to improve the system for everyone.

Results

The students collaborated to produce their “Top Tips for Getting the Most Out of Your Academic Tutor”, which we have turned into a postcard to share with new undergraduates using the free design software Canva (https://www.canva.com/).

The students also individually made short videos at home of their own top tips and emailed them to me; I enlisted my teenage son to edit these into two short videos, one aimed at postgraduates and one at undergraduates, which I can use as part of induction.

From the project I now have useful data on how our students use their academic tutor. A thematic analysis of qualitative comments from the questionnaires and focus groups identified five key themes:

  • Tutor availability
  • Communication by the tutor
  • School level communication
  • Content of meetings
  • Staffing

From these themes I have drawn up a detailed action plan to be implemented to deal with student concerns.

Impact & Reflections

One of the main messages was that we need to do better at clearly communicating the role of the academic tutor to students and staff.

The students’ advice videos are low-tech but high-impact, all recorded in lockdown on their phones from around the world, sharing what they wish they’d known and advising their fellow Law students how to maximise the tutor/tutee relationship. The videos have been shared with the STAR mentor team and the ATS Steering Group, and the MCE team now has the original footage to see whether the videos can be used University-wide.

I am firmly convinced that students are more likely to listen to advice from their peers than from an academic, so am hopeful that the advice postcards and videos will help, particularly if we have a more virtual induction process in the forthcoming academic year.

Ultimately, whilst not the project I initially envisaged, our virtual focus group alternative worked well for my student partners, and they were still able to co-create resources, in a more innovative format than I anticipated. My message to colleagues is to trust the students to know what will work for their fellow students, and don’t be afraid to try something new.

 

The DEL Feedback Action Plan

Madeleine Davies, Cindy Becker and Michael Lyons, SLL

Overview

A feedback audit and consultation with the Student Impact Network revealed a set of practices that DEL needs to amend. The research produced new student-facing physical and online posters, designed by a ‘Real Jobs’ student, showing students how to find their feedback online, and generated ‘marking checklists’ for staff indicating what needs to be included in feedback and what needs to be avoided.

Objectives

  • To assess why students scored DEL poorly on feedback in NSS returns
  • To consult with students on types of feedback they considered useful
  • To brief colleagues on good feedback practice
  • To produce consistency (but not conformity) in terms of, for example, the amount of feedback provided, feedforward, full feedback for First Class work, etc.
  • To assess whether marking rubrics would help or hinder DEL feedback practice

Context

The ‘DEL Feedback Action Project’ addresses the persistent issue of depressed NSS responses to Department of English Literature assessment and feedback practices. The responses to questions in the ‘teaching quality’ sections are favourable, but the 2018 NSS revealed that, for English Studies, Reading is in the third quartile for the ‘Assessment and Feedback’ section and the bottom quartile for question 8 (scoring 64% against the 74% median score) and question 9 (scoring 70% against the 77% median score).

In October 2018, DEL adopted eSFG. An EMA student survey undertaken in January 2019 polled 100 DEL students and found that, though students overwhelmingly supported the move to eSFG, complaints about the quality of DEL feedback persisted.

Implementation

Michael Lyons began the project with an audit of DEL feedback and identified a number of areas where the tone or content of feedback may need improving. This material was taken to the Student Impact Network, where students commented on anonymised samples of feedback. This produced a set of indicators which became the basis of the ‘marking checklist’ for DEL staff. Simultaneously, DEL staff were asked to discuss feedback practice in ‘professional conversations’ for the annual Peer Review exercise. This ensured that the combined minds of the whole department were reflecting on the issue.

Student consultation also revealed that many students struggle to find their feedback online. With this in mind, we collaborated with TEL to produce ‘maps to finding feedback’ for students. A ‘Real Jobs’ student designer converted this information into clear, readable posters which can be displayed online or anywhere in the University (the information is not DEL-specific). The posters will be of particular use for incoming students but our research also suggested that Part 3 students are often unaware of how to access feedback.

The results of the initial audit and consultation with students indicated where our feedback had been falling short. We wrote a summary of these findings for the DEL HoD and DDTL.

Research into marking rubrics revealed that DEL marking would not be suited to this feedback practice: rubrics can be inflexible, and DEL students resist ‘generic’ feedback.

Impact

The student-facing posters and the staff-facing ‘marking checklist’ speak to two of the main issues with DEL feedback indicated by students. The latter will deter overly brief, curt feedback and will prompt more feedforward and comment on specific areas of the essay (for example, the introductory passage, essay structure, referencing, grammar, use of secondary resources, etc.).

With DEL staff now focused on the feedback issue, and with students equipped to access their feedback successfully, we are hoping to see a marked improvement in NSS scores in this area in 2020-21.

For ‘surprises’, see ‘Reflections’.

Reflections

The pressure on academic staff to mark significant amounts of work within tight deadlines can lead to potential unevenness in feedback. We are hoping that our research prompts DEL to streamline its assessment practice to enhance the quality and consistency of feedback and feedforward.

Students’ responses in the Student Impact Network also suggested that additional work is required on teaching students how to receive feedback. Over-sensitivity in some areas can produce negative scores. With this in mind, the project will terminate with an equivalent to the ‘marking checklist’ designed for students. This will remind students that feedback is anonymous, objective, and intended to pave the way to success.

Follow up

Monitoring NSS DEL feedback scores in the 2020-21 round, and polling students in the next session to ensure that they are now able to access their feedback.

Continuing to reflect on colleagues’ marking workload and the link between this and unconstructive feedback.

 

 

Student co-creation of course material in Contract Law

Dr Rachel Horton, School of Law

Overview

The PLanT project involved the co-creation, with students, of a series of podcasts and other materials for Contract Law (LW1CON). Student leaders consulted with their peers to decide which materials students felt would most enhance learning on the module and then created these together with the Module Convenor.

Objectives

This project aimed to engage current law students as co-creators of course learning material.

Context

Contract Law is a large compulsory first year module – in an average year between 250 and 300 students take the module – taught using a traditional combination of lectures and small group teaching. Module staff were keen to develop additional resources for students to access in their own time through Blackboard, and wanted to engage students in developing these.

Implementation

Staff met with selected students to introduce a student-curated Blackboard space, in which the students had authoring permissions to generate podcast feeds that would be accessible to all students enrolled on the module. These students were then asked to consult with their peers to generate ideas for the use of the space and topics for the podcasts.

The student leaders then created a series of podcasts, largely focusing on revision materials and assessment and exam technique by interviewing lecturers on the module. The students also devised and created a series of written materials, in a variety of formats, and lecturers provided feedback on these (chiefly to ensure accuracy) before they were uploaded onto Blackboard.

Impact

The student leaders were highly engaged and enthusiastic and went well beyond their original remit in devising course content. They fed back, informally, that they had found the experience immensely beneficial to their own learning, as well as giving them the opportunity to develop a range of leadership, technical and communication skills.

Statistics on Blackboard showed that the materials were well used by the rest of the cohort, particularly in the immediate run up to the exams. While it proved difficult to recruit students for a focus group after the project had finished, in order to gain more structured feedback, student representatives commented at the Staff Student Liaison Committee that they had received very positive feedback from students about the additional materials created through the project.

Reflections

The success of the activity was largely a result of the enthusiasm, imagination and commitment of the students involved. We were lucky to recruit students who were able to work very well together, and with their peers, to create resources to genuinely enhance learning, and to fill gaps in course materials that may otherwise have gone unnoticed by staff.

The project also offered an opportunity for the teaching staff on the module to reflect on the content and format of the materials students want. Even after the funded project finished, this proved very helpful in enabling us to continue to produce similar materials, particularly once teaching had to move online in the wake of COVID-19.

The project and funding began in the Spring term and, with hindsight, it would have been beneficial to start the project earlier in the course. In particular, this would have provided an opportunity to gather more structured feedback from the whole cohort (it was difficult to secure a meaningful student response once the summer exams were over).

Follow up

The materials produced by the students remain relevant for future cohorts and will continue to be made available. New materials will be developed along similar lines, with student input wherever possible, particularly next year as lectures move wholly online.

Considering wellbeing within the placement module assessment

Allán Laville (Dean for D&I and Lecturer in Clinical Psychology) and Libby Adams (Research Assistant), SPCLS

Overview

This project aimed to design a new alternative assessment, to form part of the MSci Applied Psychology course, which puts emphasis on the practical side of training as a Psychological Wellbeing Practitioner (PWP). This included utilising problem-solving skills and wellbeing strategies.

Objectives

  • This project was funded by the SPCLS Teaching & Learning Enhancement Fund and aimed to design an alternative assessment to be used as part of the MSci Applied Psychology course to support student wellbeing.
  • The project aimed to incorporate an assignment into the curriculum which provides students with transferable problem-solving and wellbeing management strategies which can be used in future mental health support/clinical roles.

Context

This project was undertaken because, within IAPT, Psychological Wellbeing Practitioners (PWPs) are required to work in a fast-paced environment, seeing multiple patients back-to-back throughout the day. Students on the MSci Applied Psychology course are required in their third year to undertake a work placement one day a week in the first term, increasing to two days a week in the second term. Students are also required to undertake one full day of training per week. The aim of the project was to embed within the curriculum an assignment which focusses on managing wellbeing.

Implementation

Allán Laville (Dean for Diversity and Inclusion) introduced the concept of incorporating wellbeing within the curriculum and contacted Libby Adams (Part 4 MSci student) to see whether she would take part in the development of the new assessment. Libby was included because she had previously trained as a Psychological Wellbeing Practitioner and had first-hand experience of the challenges of managing the demands of the PWP role as a trainee, and in turn of managing her own wellbeing.

Libby Adams’ experience

The project was developed with my own challenges in mind; to build upon this, we then met with current and past MSci students to gain insight into the challenges they faced. We were then able to condense this information and incorporate it within our concept of a wellbeing blog, and considered how we could problem-solve ways around the areas that could not be included in the blog. At the second stage we met with clinical staff and educators to share our idea and gain feedback on the feasibility of implementation within IAPT services. The final project design was then formed with the above feedback in mind.

Impact

Views from current MSci students on the benefits of the project:

“I think maintaining our own wellbeing is such a critical part of caring professions, and I think that making it a clear and mandatory part of the course you’re not only helping students look after themselves for this year, but also for their future careers as well.”

Relating to the outlined objectives, the project successfully designed a prototype assessment which considers the importance of maintaining wellbeing and utilising problem-solving skills. The project will have a positive impact on the individual, not only in their placement year but also if they choose to go into a clinical career after university, as the skills are transferable.

Traffic Light Mood Tracker

Students are required to complete the traffic light system to indicate how they are currently managing their wellbeing. They complete it three times for each blog: once before the reflection, once after they have built an action plan based on their reflection, and again in the last term of the academic year, reflecting on their progress.

Reflections

Allán Laville’s reflections:

The project addressed a key consideration within both University training and the psychological workforce, namely the importance of explicitly considering the wellbeing of our practitioners and therapists. I am delighted with the outcome of the project, and it would not have been possible without Libby. Her commitment to psychological therapies and intrinsic motivation to support others always shine through!

Libby Adams’ reflections:

The student-staff partnership is key to improving the overall teaching and learning experience. The partnership allows the member of staff to lead as the expert by knowledge and the student to lead as the expert by experience. Such partnerships allow the development of concepts and improvements in teaching and learning which enhance the student and staff experience.

Follow up

In the future we aim to share our findings with other MSci courses and IAPT services with an aim to increase conversations about practitioner wellbeing and highlight its importance within clinical roles. We hope that strategies used in this project can extend beyond students and be used across IAPT services to maintain wellbeing, improve performance and decrease stress and burnout.

Developing psychoeducational materials for individuals with learning disabilities

Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology) and Charlotte Field (Research Assistant and student on MSci Applied Psychology)

Overview

To improve access to psychoeducational materials by addressing the diverse needs of those accessing Improving Access to Psychological Therapies (IAPT) services. We worked on creating materials that could be used to describe psychological disorders such as Depression and Generalized Anxiety Disorder (GAD) to those who have learning disabilities. Here we reflect upon the benefits of completing this project via a student-staff partnership, as well as the potential benefits of using the materials within IAPT.

Objectives

  • This project was funded by the SPCLS Teaching & Learning Enhancement Fund and aimed to create psychoeducational materials, suitable for those with learning disabilities, that depict Depression, GAD and Panic Disorder.
  • To effectively utilise student and staff feedback in the creation of these materials.

Context

This project was undertaken because, within IAPT, Psychological Wellbeing Practitioners (PWPs) typically use text-heavy materials when explaining psychological disorders. This can create access barriers for those with learning disabilities, arguably both within services and at a university teaching level.

The aim of the project was to create visual representations of how the person may be feeling depending on the psychological disorder.

Implementation

Allán Laville (Dean for Diversity and Inclusion) designed the concept of psychoeducational materials for learning disabilities and then approached Charlotte Field to see whether she wanted to take part in the development of these materials. It was important to include Charlotte here as she is training as a PWP and has also studied Art.

Charlotte Field’s experience

The preliminary stage of the project involved doing rough sketches of how Depression, GAD and Panic would be represented. These were discussed and evaluated in an initial focus group with other students on the MSci Applied Psychology Cohort 5. Reflecting on and reviewing the feedback received enabled me to produce drawings that were more interactive, as well as providing both a more literal and a more figurative version of each disorder to help make things clearer, thereby making the drawings more accessible and appropriate for those with learning disabilities. I had the opportunity to review feedback on the completed drawings for a second time before the drawings were submitted.

Impact

Charlotte shares her view of the impact of completing this activity:

The materials have been developed to add to the resources which could improve access for those with learning disabilities within Improving Access to Psychological Therapies (IAPT). As the rest of the MSci cohort and I are training as PWPs, this was especially relevant to developing our clinical skills. These materials will be used in the training of future MSci cohorts, both within in-class role-plays and summative role-play assessments.

Reflections

Allán Laville’s reflections:

The student-staff partnership was key to the success of the project as we needed to ensure that the student voice was at the forefront. This was achieved in the work Charlotte completed herself as well as within the focus group and subsequent feedback on the psychoeducational materials over email. Based on this positive experience, we are keen to continue this approach to innovative T&L practices.

Charlotte Field’s reflections:

The student-staff partnership is of great importance as it builds collaboration and crucial links between students and staff. This is particularly important with projects such as this as it combines the knowledge and expertise from experienced staff members with the student’s current experience working within these services.

Follow up

In future, we will aim to develop similar psychoeducational materials for treatment interventions within Low Intensity Cognitive Behavioural Therapy. For example, materials for Behavioural Activation, which aims to increase an individual’s routine, necessary and pleasurable activities in order to improve their mood. This intervention would lend itself well to pictorial representations.

Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we have created a Blackboard feedback tool which draws on self-help strategies used in NHS Mental Health services. This was a CQSD TLDF-funded project, and we reflect upon the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are consistently rated as the lowest items on student surveys, despite efforts from staff to address this. Whilst staff can certainly continue to improve their practices around providing feedback, our efforts turned to how we could improve student engagement in this area. Upon investigating existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the ‘Developing Engagement with Feedback Toolkit’ (DEFT) has useful components, such as a glossary helping students with academic jargon, but it does not provide resources to help with feedback-related stress. The aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

 Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on self-help resources, such as cognitive restructuring, used within the NHS to treat anxiety and depression. These ideas were taken to a student focus group to gather students’ thoughts and opinions on the type of resource they would like to help them understand and use their feedback.

Considering ideas from the literature review and the focus group, we established the various components of the tool: a ‘purpose of feedback’ video, problem-solving and cognitive restructuring techniques, a reflective log, and a ‘where to go for further support’ page. We then started creating our prototype Blackboard tool. At the tool creation stage, we worked collaboratively with the TEL team (Maria, Matt and Jacqueline) to help format and launch the tool. Upon launch, students were given access to the tool via Blackboard, along with a survey to complete once they had explored and used the tool.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to identify, challenge and re-balance students’ negative or stressful thoughts related to receiving feedback. Some students reported in the tool survey that they found this technique useful.

The examples also seemed to help students relate the tool to their past experiences of not getting a good grade. Students appreciated the interactive features, such as the video of the lecturer (addressing the fact that feedback is not a personal attack), and were looking forward to the tool being fully implemented during their next academic year. Overall, the student survey responses were positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the combined contributions of the focus group and the literature review: the students’ ideas for the tool from the focus group built on the evidence-based self-help ideas gathered from the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in the future, in academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make the tool even more user-friendly, we could consider improving the initial main contents page: for example, presenting options such as ‘I want to work on improving x’ that lead on to the appropriate self-help resource, instead of simply starting with the resource options (e.g. problem solving, reflective log).

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk, Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments in the Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership, as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell. Chloe is a current PhD student and previous MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the electronic assessment overview concept and then approached Chloe Chessell to see whether she wanted to take part in the development of these overviews. It was important to include Chloe here as she has lived experience of completing the programme and therefore, can offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed added to the information that I had as a student, as this format allows students to review assessment information in their own time, and at their own pace. Screencasts can also be revisited, which may help students to ensure they have met the marking criteria for a specific assessment. Furthermore, embedded videos/links to information to support the development of key writing skills (e.g. critical analysis skills) within these screencasts expand upon the information my cohort received, and will help students to develop these skills at the onset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, we now have a further 4 student-staff projects that are currently being completed and we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from students’ experiences of receiving course information and their views on course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage with their course at a higher level, allowing them to have a role in shaping the course around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.

‘A-level Study Boost: Unseen Poetry and the Creative Process’: an online course

Rebecca Bullard, School of Literature and Languages, r.bullard@reading.ac.uk

Overview

‘A-level Study Boost: Unseen Poetry and the Creative Process’ is a two-week online course created by staff and students in the Department of English Literature and the Online Courses team, and hosted on the social learning platform, FutureLearn. It engages a global audience of learners in reading, writing, discussing, and enjoying poetry.

Objectives

The analysis of poetry, sometimes called ‘close reading’ or ‘practical criticism’, is a core skill for the study of English Literature. This course aims to develop this skill in pre- and post-A-level students of English Literature in ways that supplement teaching in schools and FE colleges. In doing so, it encourages students to make a successful transition from A-level to university-level study of English and Creative Writing.

Context

The Online Courses team at UoR approached colleagues in the Department of English Literature to work with them to develop a course that would connect students’ pre-university learning with their studies at UoR. The resulting online course develops learners’ subject-specific skills and gives them insight into what studying English and Creative Writing at university level might be like.

Implementation

Staff in the Online Courses team and the Department of English Literature worked together to combine their diverse areas of expertise. Yen Tu, Digital Learning Producer, supported by Sarah Fleming, Assistant Digital Learning Producer, ensured that the course reflects best practice in the pedagogy of online social learning (Sharples 2018; Laurillard 2014). Rebecca Bullard, as subject specialist, wrote the articles and designed tasks and activities to develop learners’ creative and critical skills.

It took about six months of intensive collaboration to produce the course materials. The first live run of the course took place over two weeks in December 2019. Rebecca and a team of student mentors engaged with learners on the FutureLearn platform throughout the live run to facilitate social learning and encourage completion of the course. The course content, feedback and statistics are currently being evaluated in order to measure impact and inform the next run.

Impact

The impact of the initial run of this course can be evaluated using the UoR Evaluation and Impact Framework (L1: Reach, L2: Reaction, L3: Learning, L4: Behaviour), using course analytics and comments from learners. Some participants gave permission for us to use their comments; where permission was not explicitly given, comments have been paraphrased:

L1: c. 1970 learners from over 100 countries enrolled on the first live run of this course. Comments on completing the course included the following:

L2: “I have always loved poetry but found some modern poems inaccessible. This course [has] shown me some ways to gain access.”

L3/4: “I’m a school teacher, having to teach unseen texts next year. This course has made me enjoy reading and dissecting poetry and I hope that I’ll succeed in inspiring my students to do the same.”

L3/4: One learner commented that the course has changed her perspective on poetry and that she is considering applying to UoR as a result of this course.

Reflections

The success of the course emerged out of the different kinds of collaboration that it involved and encouraged:

Staff-student: The course highlighted the expertise of UoR staff and students. The course videos showcase real teaching methods used in the Department of English Literature, and offer tangible evidence of the academic excellence and the outstanding learning experience that underpin the UoR T&L Strategy 2018-21. Current students were paid to work as mentors on the course, giving them confidence in their own expertise.

English Literature-Creative Writing: The course engages learners in both critical analysis and creative practice, reflecting research that indicates the close relationship between these different methods of approaching literary studies (Lockney and Proudfoot 2013).

Department of English Literature-Online Courses: Specialists in both areas drew on their different kinds of expertise to develop a structure, set of activities, tone and style for the course that encourage maximum engagement from learners.

Learner-Educator-Mentor: The social learning platform FutureLearn facilitates active, real-time conversations between Learners, Educators and Mentors, which strengthens and deepens their engagement with the course material.

Follow up

During 2020, further research will be undertaken to evaluate the impact of the course on particular learner groups. The Online Courses team will run a research study to evaluate how teachers (including those in WP areas) are using the course in their teaching. The Department of English Literature will evaluate the impact of the course on students enrolled on EN1PE: Poetry in English.

‘Unseen Poetry’ will be an exemplar for a new ‘A-Level Study Boost’ series which will be rolled out to other Schools across UoR.

Links

‘A-level Study Boost: Unseen Poetry and the Creative Process’: https://www.futurelearn.com/courses/a-level-study-unseen-poetry

References

Laurillard, Diana. 2014. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. Abingdon: Routledge.

Lockney, K. & K. Proudfoot. 2013. ‘Writing the unseen poem: Can the writing of poetry help to support pupils’ engagement in the reading of poetry?’ English in Education 47:2, 147-162.

Sharples, M. 2018. The Pedagogy of FutureLearn: How our learners learn. https://about.futurelearn.com/research-insights/pedagogy-futurelearn-learners-learn

Piloting General Practice (GP) experiential learning for MPharm Year 3 students

Catherine Langran, Lecturer in Pharmacy Practice, School of Pharmacy

Daniel Mercer & Selen Morelle, MPharm Part 4 students, School of Pharmacy

Background

Throughout the Master of Pharmacy (MPharm) degree, students undertake experiential learning in hospital and community pharmacies. Experiential learning through placements is an important approach to teaching and learning: it provides a safe learning environment for students, bridges the gap between theory and practice, and encourages independent learning and reflective practice.

In 2016, the National Health Service (NHS) launched the “Building the General Practice Workforce” programme, creating a new career pathway for pharmacists performing clinical tasks in a primary care setting. Over the past three years a steadily increasing number of pharmacists have pursued this career option, and it is now a graduate opportunity for our MPharm students. It is therefore crucial that Reading School of Pharmacy provides undergraduate students with an opportunity to experience this new role, to give them more insight into their career options, encourage professional and personal development, and boost employability.

This collaborative partnership project piloted placements within GP practices for Part 3 pharmacy students to assess the students’ perceptions and evaluate the benefits and practicality of the placements.

Method

59 Part 3 students (46% of the cohort) attended a voluntary session in November 2018, prior to submitting the PLanT application. This session demonstrated a high level of student interest in this placement opportunity and also involved discussion of the practicalities (e.g. placement length, positioning within timetable, location) and perceived advantages of offering GP placements.

Following a successful bid to the PLanT fund, a second voluntary session was attended by 22 students who collaboratively worked with the project lead to determine the process of student recruitment and allocation to placements, define the placement learning outcomes, placement activities, evaluation methods and how to collect feedback. Subsequently, the two project lead students worked with the lead academic to construct an online application process, review student applications, finalise the student handbook and evaluate the student feedback.

The main objectives of this project were:

  • To evaluate the benefits of undertaking the GP placements for MPharm students.
  • To evaluate the placement provider’s feedback on the acceptability, practicality and scalability of providing placements for students.

Five GP practices, located in Reading and London, were recruited to take part in the pilot. From April to June 2019, a total of 37 Part 3 MPharm students completed a half-day to one-day placement in one of the five practices. Students predominantly shadowed the GP pharmacist within a clinic environment, and some also had the opportunity to shadow GPs, nurses, physician associates and reception teams, giving them a greater understanding of how general practices function as a business.

Data were collected via online questionnaires completed by students before and after their GP placement, comparing their:

  • Understanding of the role of GP pharmacists and how GP surgeries work (with 0=no knowledge to 10=complete knowledge)
  • Confidence building rapport and being empathetic when talking to patients (0=no confidence to 10=fully confident)

Students also decided that they would like to prepare and deliver a short 5-minute verbal presentation to their peers and the project group to share experiences and insights from their GP placement.

We also collected feedback from placement providers after completion of the placements.

Results

37 students completed the pre-placement questionnaire, and 30 students completed the post-placement questionnaire. Analysis of the data shows that the students who undertook the placement displayed a significant improvement in their understanding of the GP pharmacist role and the structure and running of a GP practice. A moderate increase in empathy and building rapport was also seen.
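The case study does not state which statistical test lay behind the reported improvement, but for paired 0-10 ratings of this kind a non-parametric paired comparison is one common choice. The sketch below, written in Python with invented scores, shows how such a pre/post comparison might be run using a Wilcoxon signed-rank test on the students who completed both questionnaires; the data and variable names are purely illustrative.

# Illustrative sketch only: the scores below are invented and the choice of test is an
# assumption, not the analysis reported in the case study.
from scipy.stats import wilcoxon

# Hypothetical paired ratings of "understanding of the GP pharmacist role" (0-10 scale),
# one pre- and one post-placement score per student who completed both questionnaires.
pre_scores = [3, 4, 2, 5, 4, 3, 6, 2, 4, 5]
post_scores = [7, 8, 6, 8, 7, 6, 9, 5, 7, 8]

# Wilcoxon signed-rank test: suitable for paired, ordinal (0-10) data.
stat, p_value = wilcoxon(pre_scores, post_scores)

# Median pre-to-post change as a simple summary of the size of the shift.
diffs = sorted(post - pre for pre, post in zip(pre_scores, post_scores))
median_change = diffs[len(diffs) // 2]

print(f"Wilcoxon statistic: {stat}, p-value: {p_value:.4f}")
print(f"Median pre-to-post change: +{median_change} points")

With the real data, the same pairing logic would simply restrict the comparison to the 30 students who completed both the pre- and post-placement questionnaires.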

Students’ evaluations of the GP placements were overwhelmingly positive, highlighting improved knowledge of the role of GP pharmacists and the insight gained into their potential career choices.

In their peer presentations, students described key learning points:

– An understanding of how the skills of different health care professionals can work together to offer the best care to patients

– The value of observing pharmacist consultations with patients, and reflecting on how treatment decisions are made

– An increased understanding of the options available to them after graduation, enabling them to make a more informed career choice.

Feedback from placement providers showed they found hosting the placement enjoyable/rewarding, they felt the students were enthusiastic, and the organisation/communication from the university was excellent.

Limitations

Whilst the cohort of students who attended the placement days appear to have improved their understanding of GP pharmacy, we are aware that the students undertook the placements voluntarily. These students had a desire to explore the role of GP pharmacists, which implies they had an interest in the area before undertaking the placement; their opinions may therefore be favourably biased towards the role.

Impact

The student co-design element ensured this pilot delivered an authentic and valuable experience, with high levels of student engagement.

As a result of this pilot, funding has been secured from our Head of Department to implement GP placements for all Part 3 students (cohort size 106) from December 2019. Working partnerships have been established with the five GP practices, and this has now been expanded to 16 GP practices for 2019/2020. Embedding GP placements for our students will have a positive impact on the MPharm re-accreditation by our regulator, the General Pharmaceutical Council, in March 2020.

There is the potential for this project to have a long-term impact on NSS results and employability, which will be explored in June 2020. Offering these placements sets us apart from other Schools of Pharmacy and is a key selling point in our new UCAS brochure.

‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Rita Balestrini and Elisabeth Koenigshofer, School of Literature and Languages, r.balestrini@reading.ac.uk; e.koenigshofer@reading.ac.uk

Overview

Working in collaboration with two Final Year students, we designed two ‘flexible’, ‘minimalist’ rubric templates, usable and adaptable across different languages and levels, to provide a basis for the creation of level-specific, and potentially task-specific, marking schemes in which sub-dimensions can be added to the main dimensions. The two marking templates are being piloted this year in the DLC. The project will feature in this year’s TEF submission.

Objectives

Design, in partnership with two students, rubric templates for the evaluation and feedback of writing tasks and oral presentations in foreign languages which:

  • were adaptable across languages and levels of proficiency
  • provided a more inclusive and engaging form of feedback
  • responded to the analysis of student focus group discussions carried out for a previous TLDF-funded project

Context

As a follow-up to a teacher-learner collaborative appraisal of rubrics used in MLES, now DLC, we designed two marking templates in partnership with two Final Year students, who had participated in the focus groups from a previous project and were employed through Campus Jobs. ‘Acknowledgement of effort’, ‘encouragement’, ‘use of non-evaluative language’, and ‘need for, and at the same time distrust of, objective marking’ were recurrent themes that had emerged from the analysis of the focus group discussions, and these issues clearly appeared to cause anxiety for students.

Implementation

We organised a preliminary session to discuss these findings with the two student partners. We suggested some articles about ‘complexity theory’ as applied to second language learning (Kramsch, 2012; Larsen-Freeman, 2012, 2015a, 2015b, 2017), with the aim of making our theoretical perspective explicit and transparent to them. A second meeting was devoted to collaboratively planning the structure of two marking schemes, one for writing and one for presentations. The two students worked independently to produce examples of standard descriptors which avoided the use of evaluative language and emphasised achievement rather than shortcomings. At a third meeting they presented and discussed their proposals with us. At the final meetings, we continued working to finalise the templates and the two visual learning charts they had suggested. Finally, the two students wrote a blog post to recount their experience of this collaborative work.

The two students appreciated our theoretical approach, felt that it was in tune with their own point of view, and thought that it could support the enhancement of the assessment and marking process. They also found resources on their own, which they shared with us – including rubrics from other universities. They made valuable suggestions, gave us feedback on our ideas and helped us to find alternative terms when we were struggling to avoid the use of evaluative language in our descriptors. They also suggested making use of some visual elements in the marking and feedback schemes in order to increase immediacy and effectiveness.

Impact

The two marking templates are being piloted this year in the DLC. They were presented to colleagues over four sessions during which the ideas behind their design were explained and discussed. Further internal meetings are planned. These conversations, already begun with the previous TLDF-funded project on assessment and feedback, are contributing to the development of a shared discourse on assessment, which is informed by research and scholarship. The two templates have been designed in partnership with students to ensure accessibility and engagement with the assessment and feedback process. This is regarded as an outstanding practice in the ‘Assessment and feedback benchmarking tool’ produced by the National Union of Students and is likely to feature positively in this year’s TEF submission.

Reflections

Rubrics have become mainstream, especially within certain university subjects such as Foreign Languages. They have been introduced to ensure accountability and transparency in marking practices, but they have also created new problems of their own by promoting a false sense of objectivity in marking and grading. The openness and unpredictability of complex performance in foreign languages, and of the dynamic language learning process itself, are not adequately reflected in the detailed descriptors of the marking and feedback schemes commonly used for the objective numerical evaluation of performance-based assessment in foreign languages. As emerged from the analysis of focus group discussions conducted in the department in 2017, the lack of understanding of, and engagement with, the feedback provided by this type of rubric can generate frustration in students. Working in partnership with students, rather than simply listening to their voices or seeing them as evaluators of their own experience, helped us to design minimalist and flexible marking templates which make use of sensible and sensitive language, introduce visual elements to increase immediacy and effectiveness, leave a considerable amount of space for assessors to comment on different aspects of an individual performance, and provide ‘feeding forward’ feedback. This type of partnership can be challenging because it requires remaining open to unexpected outcomes. Whether it can bring about real change depends on how its outcomes interact with the educational ecosystems in which it is embedded.

Follow up

The next stage of the project will involve colleagues in the DLC who will be using the two templates to contribute to the creation of a ‘bank’ of descriptors by sharing the ones they will develop to tailor the templates for specific stages of language development, language objectives, language tasks, or dimensions of student performance. We also intend to encourage colleagues teaching culture modules to consider using the basic structure of the templates to start designing marking schemes for the assessment of student performance in their modules.

Links

An account written by the two student partners involved in the project can be found here:

Working in partnership with our lecturers to redesign language marking schemes

The first stages of this ongoing project to enhance the process of assessing writing and speaking skills in the Department of Languages and Cultures (DLC, previously MLES) are described in the following blog entries:

National Union of Students 2017. The ‘Assessment and feedback benchmarking tool’ is available at:

http://tsep.org.uk/wp-content/uploads/2017/07/Assessment-and-feedback-benchmarking-tool.pdf

References

Bloxham, S. 2013. Building ‘standard’ frameworks. The role of guidance and feedback in supporting the achievement of learners. In S. Merry et al. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

Bloxham, S. and Boyd, P. 2007. Developing effective assessment in Higher Education. A practical guide. Maidenhead: McGraw-Hill International.

Bloxham, S., Boyd, P. and Orr, S. 2011. Mark my words: the role of assessment criteria in UK higher education grading practices. Studies in Higher Education 36 (6): 655-670.

Bloxham, S., den-Outer, B., Hudson, J. and Price, M. 2016. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 41 (3): 466-481.

Brooks, V. 2012. Marking as judgement. Research Papers in Education. 27 (1): 63-80.

Gottlieb, D. and Moroye, C. M. 2016. The perceptive imperative: Connoisseurship and the temptation of rubrics. Journal of Curriculum and Pedagogy 13 (2): 104-120.

HEA 2012. A Marked Improvement. Transforming assessment in HE. York: The Higher Education Academy.

Healey, M., Flint, A. and Harrington K. 2014. Engagement through partnership: students as partners in learning and teaching in higher education. York: The Higher Education Academy.

Kramsch, C. 2012. Why is everyone so excited about complexity theory in applied linguistics? Mélanges 33: 9-24.

Larsen-Freeman, D. 2012. The emancipation of the language learner. Studies in Second Language Learning and Teaching. 2(3): 297-309.

Larsen-Freeman, D. 2015a. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Larsen-Freeman, D. 2015b. Complexity Theory. In VanPatten, B. and Williams, J. (eds.) Theories in Second Language Acquisition: An Introduction. New York: Routledge: 227-244.

Larsen-Freeman, D. 2017. Just learning. Language Teaching 50 (3): 425-437.

Merry, S., Price, M., Carless, D. and Taras, M. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

O’Donovan, B., Price, M. and Rust, C. 2004. Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education 9 (3): 325-335.

Price, M. 2005. Assessment standards: the role of communities of practice and the scholarship of assessment. Assessment & Evaluation in Higher Education 30 (3): 215-230.

Sadler, D. R. 2009. Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education 34 (2): 159-179.

Sadler, D. R. 2013. The futility of attempting to codify academic achievement standards. Higher Education 67 (3): 273-288.

Torrance, H. 2007. Assessment as learning? How the use of explicit learning objectives, assessment criteria and feedback in post-secondary education and training can come to dominate learning. Assessment in Education 14 (3): 281-294.

VanPatten & J. Williams (Eds.) 2015. Theories in Second Language Acquisition, 2nd edition. Routledge: 227-244.

Yorke, M. 2011. Summative assessment: dealing with the ‘Measurement Fallacy’. Studies in Higher Education 36 (3): 251-273.