Improving student assessment literacy & engaging students with rubrics

Dr Allán Laville

School of Psychology & Clinical Language Sciences

In this 14-minute video, early rubrics adopter Dr Allán Laville shares how he and colleagues in Psychology have sought to improve student assessment literacy, and have successfully engaged students with their assessment rubrics by embedding analysis of them into their in-class teaching and by using screencasts, discussion boards and student partnership. Lots of useful ideas and advice – well worth a watch.

Promoting and Tracking Student Engagement on an Online Undergraduate Pre-sessional Course

Sarah Mattin: International Study and Language Institute

Overview

This case study outlines approaches to fostering an active learning environment on the University’s first fully online Undergraduate Pre-sessional Course which ran in Summer 2020 with 170 students. It reports staff and student feedback and reflects on how lessons learnt during the summer can inform ISLI’s continued online delivery this autumn term and beyond.

 

Objectives

  • To design and deliver an online Pre-sessional Course to meet the needs of 170 students studying remotely, mostly in China
  • To promote student engagement in learning activities in an online environment
  • To devise effective mechanisms for tracking student engagement and thus identify students who may require additional support

 

Context

The Pre-sessional Programme (PSE) is an English for Academic Purposes (EAP) and academic skills development programme for degree offer holders who require more study to meet the English Language requirements of their intended programme. The programme runs year-round and, in the summer, has separate UG and PG courses. We would usually expect to welcome around 700 students to the campus for the summer courses (June-September); in summer 2020 we took the courses fully online in response to the COVID crisis. This case study focuses on the Undergraduate Course.

 

Implementation

Due to the constraints of the time difference between the UK and China, where most students were based, we knew learning on the course would need to be largely asynchronous. However, we were keen to promote active learning and so adopted the following approaches:

  • Use of the online authoring tool Xerte to create interactive learning materials which enabled students to receive immediate feedback on tasks.
  • Incorporation of asynchronous peer and student-teacher interaction into the course each week through scaffolded tasks for the Blackboard Discussion Boards.
  • Setting up of small study groups of 3-4 students within each class of 16 students. Each group had fortnightly tutorials with the teacher and was encouraged to use the group for independent peer support.
  • Live online interactive sessions which took a ‘flipped’ approach, so students came prepared to share and discuss their work on a set task and ask any questions.

In order to track engagement with the learning materials, we used Blackboard Tests to create short ‘Stop & Check’ quizzes (4-5 questions) at regular intervals throughout the week, and used the Grade Centre to monitor completion of these. We also made use of other student engagement monitoring features of Blackboard, in particular the Retention Centre (within Evaluation) and Blackboard Course Reports, which enable instructors to track a range of user activity.
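For teams who want to automate this kind of completion check outside the Grade Centre (for example, from a downloaded spreadsheet), a minimal Python sketch is shown below. It is illustrative only: the file name, column headings and threshold are assumptions, not Blackboard's actual export format.

```python
# Illustrative sketch only: the CSV layout, column names and threshold are
# assumptions, not Blackboard's actual Grade Centre export format.
import csv

MISSED_QUIZ_THRESHOLD = 2  # flag students who miss two or more quizzes


def flag_low_engagement(path: str) -> list[str]:
    """Return names of students who missed too many 'Stop & Check' quizzes.

    Expects one row per student, with one column per quiz holding a score,
    or an empty cell where the quiz was not attempted.
    """
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            quiz_cols = [c for c in row if c.startswith("Stop & Check")]
            missed = sum(1 for c in quiz_cols if not row[c].strip())
            if missed >= MISSED_QUIZ_THRESHOLD:
                flagged.append(row["Student Name"])
    return flagged


if __name__ == "__main__":
    # Hypothetical export file; students listed would receive a follow-up.
    for name in flag_low_engagement("grade_centre_export.csv"):
        print(f"Follow up with: {name}")
```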

 

Impact

Our tracking showed that most students were engaging with the tasks daily, as required. We were very quickly able to identify a small group of students who were not engaging as hoped, and to target additional communication and support at these students.

Student feedback demonstrated that students perceived improvements in their language ability across the four skills (reading, writing, speaking and listening), and this was confirmed by their results at the end of the course. Student outcomes were good, with over 90% of students achieving the scores they needed to progress to their chosen degree programme. This compares favourably with the progression rate for the on-campus course in previous years.

Feedback from teachers on the learning materials was very positive. One teacher commented that ‘The videos and Xerte lessons were excellent. As a new teacher I felt the course was very clear and it has been the best summer course I have worked on’. Teachers highlighted Xerte, the Discussion Boards and the interactive sessions as strengths of the course.

The materials and overall design of the course have informed the Pre-sessional Course (PSE 1) which is running this Autumn Term.

 

Reflections

Overall, we designed and delivered a course which met our objectives. Some reflections on the tools and approaches we employed are as follows:

Xerte lessons: these were definitely a successful part of the course, enabling us to provide interactive asynchronous learning materials with immediate feedback to students. We also found the Xerte lessons enabled us to make coherent ‘packages’ of smaller tasks, helping us to keep the Blackboard site uncluttered and easy to navigate.

Discussion Boards: teacher feedback indicated that some felt this part of the course was an enhancement on the previous face-to-face delivery. Points we found were key to the success of Discussion Board tasks were:

  • Creation of a new thread for each task to keep threads a manageable size
  • Linking to the specific thread from the task using hyperlinks
  • Detailed and specific Discussion Board task instructions for students, broken down into steps of making an initial post and responding to classmates’ posts, with deadlines for each step
  • Teacher presence on the Discussion Board
  • Teacher feedback on group use of the Discussion Board in live sessions to reinforce the importance of peer interaction

Small study groups: these were a helpful element of the course and greater use could have been made of them. For example, one teacher developed a system of having a rotating ‘group leader’ who took responsibility for guiding the group through an assigned task each week. In the future we could incorporate this approach and build more independent group work into the asynchronous learning materials to reinforce the importance of collaboration and peer learning.

Live sessions: student feedback showed clearly that this was an aspect of the course students particularly valued. Both students and teachers felt there should be more live contact, but that sessions need not be long; even an additional 30 minutes a day would have made a difference. Teachers and students commented that Teams provided a more stable connection for students in China than Blackboard Collaborate.

Blackboard Tests and monitoring features of Blackboard: these were undoubtedly useful tools for monitoring student engagement. However, they generate a great deal of data which is not always easy to interpret ‘at a glance’ and which provides a fairly superficial account of engagement. Most teachers ended up devising their own tracking systems in Excel, which enabled them to identify and track performance on certain key tasks each week.

 

Follow up

Taking into account the feedback from this year, the materials developed could be used in future to facilitate a flipped learning approach on the course, with students studying on campus or remotely. This would address the calls for more teacher-student interaction and enable the course to respond flexibly to external events. Currently, we are applying lessons learnt from the summer to the delivery of our Pre-sessional and Academic English Programmes running this term.

 

Links

The Pre-sessional English and Academic English Programme webpages give more details about the Programmes.

Pre-sessional: http://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Academic English Programme: http://www.reading.ac.uk/ISLI/enhancing-studies/isli-aep.aspx

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen (School Director of Teaching and Learning, ISLI) and Alison Nicholson (Honorary Fellow, UoR)

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, challenges and outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into their assessment mix. Group assessments constitute around 4% of the University’s assessment types (EMA data, academic year 2019-20).

Objectives

  • To improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • To enable students to be more relaxed and produce more authentic and spontaneous language
  • To make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules centre on the ability to communicate in the language. The teaching methodology favours student–student interaction and collaboration: in class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers, as well as students, were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed: one teacher concentrated on running the exam and marking using holistic marking criteria, while the second teacher listened and marked using analytic rating scales

Expected outcomes

  • Students would be more relaxed and produce more authentic and spontaneous language: student-to-student interaction creates a more relaxed atmosphere, students take longer speaking turns, and students use more features of interaction (Hardi Prasetyo, 2018)
  • There would be perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how the competence of the person a student is speaking to affects their outcomes (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses coming from students of Italian. Responses were consistent across the two languages.

Eight teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, or that speaking to another student was more relaxing. The qualitative feedback tells us that this was due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for the average student cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on the students’ ability to demonstrate that they could communicate in the language.

However, this was not reflected in the students’ feedback: there is a clear discrepancy between teachers’ and students’ perceptions of how the new format allows students to showcase their learning.

Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi & French (2011). In Taylor, L. (ed.), Examining Speaking: Research and Practice in Assessing Second Language Speaking. Cambridge: Cambridge University Press.

Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014). Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015). The Lecturer’s Toolkit: A Practical Guide to Assessment, Learning and Teaching (4th ed.). London; New York: Routledge.

 

How ISLI’s Assessment Team created an online oral exam for the Test of English for Educational Purposes (TEEP)

Fiona Orel – International Study and Language Institute (ISLI)

 

Overview

ISLI’s Test of English for Educational Purposes (TEEP) is administered at the end of pre-sessional courses as a measure of students’ academic English proficiency. The speaking test has traditionally been an academic discussion between two students that is facilitated by an interlocutor and marked by an observer.

This case study outlines the process of creating a version of the TEEP speaking test for 1-1 online delivery.

Objectives

  • To create an online TEEP speaking test that could be administered at the beginning of June to 95 students
  • To ensure reliability and security of results
  • To support students and teachers with the transition

Context

The Pre-sessional English course 3 (PSE 3) started in April, during the period of lockdown. At the end of the course all students sit a TEEP test, which includes a test of speaking skills. We realised that we would not be able to administer the usual two-student, two-teacher test, given the constraints of the technology and the changes in teaching and learning, which had to a certain degree reduced students’ opportunities for oral interaction. We would therefore need to develop a new 1-1 test that maintained the validity and reliability of the original TEEP speaking test.

Implementation

We had two main objectives: to create a valid online 1-1 speaking test, and to make sure that the technology we used to administer the test was simple and straightforward for both teachers and students, and would have reasonably reliable connectivity in the regions where students were based (China, the Middle East and the UK).

The first thing we needed to do was to return to our test specifications: what exactly were we hoping to assess through the oral exam? The original face-to-face test had five criteria: overall communication, interaction, fluency, accuracy and range, and intelligibility. We knew that interaction had been impacted by the move online, but decided that responding appropriately to others was a crucial aspect of interaction that needed to remain, and included this in the ‘overall communication’ criterion. Recognising that interlocutors would also need to be examiners, we worked on streamlining the criteria to remove redundancy and repetition, and to ensure that each block contained the same type of description in the same order, thereby making it easier for tutors to skim and recall.

We then worked out exactly which speaking functions and skills we wanted to test, and how we could do that while mostly working with existing resources. We aligned with the original test specifications by testing students’ ability to:

  • Provide appropriate responses to questions and prompts
  • Describe experiences and things
  • Give and justify an opinion by, for example, stating an opinion, explaining causes and effects, comparing, evaluating.

The test format that enabled this was:

  • Part one: an interview with the student about their studies and experience of studying online
  • Part two: a problem-solving scenario: students are introduced to a problem, which the teacher screen-shares with them, and are given three possible solutions to compare, evaluate and rank from most to least effective
  • Part three: abstract discussion building on the talk given in part two

The final stage was trialling a platform on which to conduct the tests. We had considered Zoom due to its reliability, but discounted it due to security concerns. Blackboard Collaborate had connectivity issues in China, so we decided to use Teams, as connectivity was generally better and students and teachers were familiar with the platform, having used it for tutorials. Due to the spread of students over time zones, we decided to spread the speaking tests over three mornings, finishing by 11:00 BST each day. We kept the final slot on Friday free for all teachers, to enable rescheduling of tests for any student experiencing connectivity issues on the day.

Finally, we needed to help teachers and students prepare for the tests. For students, learning materials were produced with videos of a sample test; there was a well-attended webinar to introduce the format and requirements, and the recording of this webinar was made available to all students, along with a document on their BB course. This instructed them what to do before test day and what to expect on test day.

The test format and procedures were introduced to teachers with instructions for tasks to do before, during and after the test. There was also an examiner’s script prepared, with integrated instructions and speech, to standardise how the tests were administered. Each test was recorded to ensure security and to enable moderation, and all students had to verify their identity at the start of the test. The test recording caused some problems, as the video had to be downloaded and deleted from Stream before anyone else, including the student who had been tested in the Teams meeting, could access it. For this reason we allowed 40 minutes for each 20-minute interview, as downloading was sometimes a lengthy process depending on internet speeds. We had two or three people available each day to pick up any problems, such as a teacher being unwell or having technical issues, or a student experiencing problems. This worked well: we did have to reschedule a number of tests on the first two days, but fortunately all went well on the final day. The teachers were fully committed and worked hard to put students at ease; informal feedback from students showed appreciation of the opportunity to talk 1-1 with a tutor, and tutors said that the test format provided plenty of evidence upon which to base a decision.
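To make the slot arithmetic concrete, the short sketch below generates one morning’s interview slots. Only the 40-minute slot length and the 11:00 BST cut-off come from the account above; the 09:00 start time and the date are illustrative assumptions.

```python
# Illustrative sketch: the 09:00 start and the date are assumptions; the
# 40-minute slots and the 11:00 finish come from the case study above.
from datetime import datetime, timedelta

SLOT = timedelta(minutes=40)  # 20-minute interview plus recording download time
FINISH_BY = datetime(2020, 6, 1, 11, 0)  # 11:00 BST cut-off, illustrative date


def morning_slots(start: datetime) -> list[datetime]:
    """Return every slot start time whose slot still finishes by the cut-off."""
    slots = []
    t = start
    while t + SLOT <= FINISH_BY:
        slots.append(t)
        t += SLOT
    return slots


if __name__ == "__main__":
    for slot in morning_slots(datetime(2020, 6, 1, 9, 0)):
        print(slot.strftime("%H:%M"))
    # Prints 09:00, 09:40 and 10:20: under these assumptions each examiner
    # can see three students per morning, the last finishing exactly at 11:00.
```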

Impact

The test was successful overall and there were fewer technical issues than we had anticipated. Teachers and students were happy with it as an assessment measure and we were able to award valid and reliable grades.

Working collaboratively with the teachers and the Programme Director was incredibly rewarding, and meant that we had a wide resource base of talent and experience when we did run into problems.

Reflections

Incredibly detailed planning, the sharing of information across Assessment and Pre-sessional Teams, and much appreciated support from the TEL team helped to make the test a success. Students and teachers had very clear and detailed instructions and knew exactly what was expected and how the tests would be conducted. The sharing of expertise across teams meant that problems were solved quickly and creatively, and it is good to see this practice becoming the norm.

We need to work on the issue of downloading and deleting the video after each test, as this caused some anxiety for teachers with slower internet connections. We also need to have more technical support available, especially on the first day. Most students had tested their equipment as instructed, but some who had not done so experienced issues. It would be even better if a similar activity could be built into the course so that teachers and students experience the test situation before the final test.

Follow up

ISLI’s Assessment Team is now preparing to administer the same tests to a much larger cohort of students at the end of August. We will apply the lessons learned during this administration and work to make the process easier for teachers.

The DEL Feedback Action Plan

Madeleine Davies, Cindy Becker and Michael Lyons – SLL

Overview

A feedback audit and consultation with the Student Impact Network revealed a set of practices DEL needs to amend. The research produced new student-facing physical and online posters, designed by a ‘Real Jobs’ student, to instruct students on finding their feedback online, and generated ‘marking checklists’ for staff to indicate what needs to be included in feedback and what needs to be avoided.

Objectives

  • To assess why students scored DEL poorly on feedback in NSS returns
  • To consult with students on types of feedback they considered useful
  • To brief colleagues on good practice feedback
  • To produce consistency (but not conformity) in terms of, for example, the amount of feedback provided, feedforward, full feedback for First Class work, etc.
  • To assess whether marking rubrics would help or hinder DEL feedback practice

Context

The ‘DEL Feedback Action Project’ addresses the persistent issue of depressed NSS responses to Department of English Literature assessment and feedback practices. The responses to questions in ‘teaching quality’ sections are favourable, but the 2018 NSS revealed that, for English Studies, Reading is in the third quartile for the ‘Assessment and Feedback’ section and the bottom quartile for question 8 (scoring 64% vs the 74% median score) and question 9 (scoring 70% vs the 77% median score).

In October 2018, DEL adopted eSFG. An EMA student survey undertaken in January 2019 polled 100 DEL students and found that, though students overwhelmingly supported the move to eSFG, complaints about the quality of DEL feedback persisted.

Implementation

Michael Lyons began the project with an audit of DEL feedback and identified a number of areas where the tone or content of feedback might need improving. This material was taken to the Student Impact Network, where students were shown anonymised samples of feedback and commented on them. This produced a set of indicators which became the basis of the ‘marking checklist’ for DEL staff. Simultaneously, DEL staff were asked to discuss feedback practice in ‘professional conversations’ for the annual Peer Review exercise. This ensured that the combined minds of the whole department were reflecting on the issue.

Student consultation also revealed that many students struggle to find their feedback online. With this in mind, we collaborated with TEL to produce ‘maps to finding feedback’ for students. A ‘Real Jobs’ student designer converted this information into clear, readable posters which can be displayed online or anywhere in the University (the information is not DEL-specific). The posters will be of particular use for incoming students, but our research also suggested that Part 3 students are often unaware of how to access feedback.

The results of the initial audit and consultation with students indicated where our feedback had been falling short. We wrote a summary of these findings for the DEL HoD and DDTL.

Research into marking rubrics revealed that DEL marking would not be suited to this feedback practice, because rubrics can be inflexible and because DEL students resist ‘generic’ feedback.

Impact

The student-facing posters and staff-facing ‘marking checklist’ speak to two of the main issues with DEL feedback indicated by students. The latter will deter overly brief, curt feedback and will prompt more feedforward and comment on specific areas of the essay (for example, the introductory passage, the essay structure, referencing, grammar, use of secondary resources, etc.).

With DEL staff now focused on the feedback issue, and with students equipped to access their feedback successfully, we are hoping to see a marked improvement in NSS scores in this area in 2020-21.

For ‘surprises’, see ‘Reflections’.

Reflections

The pressure on academic staff to mark significant amounts of work within tight deadlines can lead to unevenness in feedback. We are hoping that our research prompts DEL to streamline its assessment practice to enhance the quality and consistency of feedback and feedforward.

Students’ responses in the Student Impact Network also suggested that additional work is required on teaching students how to receive feedback. Over-sensitivity in some areas can produce negative scores. With this in mind, the project will terminate with an equivalent to the ‘marking checklist’ designed for students. This will remind students that feedback is anonymous, objective, and intended to pave the way to success.

Follow up

Monitoring NSS DEL feedback scores in the 2020-21 round, and polling students in the next session to ensure that they are now able to access their feedback.

Continuing to reflect on colleagues’ marking workload and the link between this and unconstructive feedback.

 

 

Developing psychoeducational materials for individuals with learning disabilities

Dr Allán Laville, a.laville@reading.ac.uk (Dean for D&I and Lecturer in Clinical Psychology), and Charlotte Field (Research Assistant and student on MSci Applied Psychology)

Overview

This project aimed to improve access to psychoeducational materials by addressing the diverse needs of those accessing Improving Access to Psychological Therapies (IAPT) services. We worked on creating materials that could be used to describe psychological disorders such as Depression and Generalised Anxiety Disorder (GAD) to those who have learning disabilities. Here we reflect upon the benefits of completing this project via a student-staff partnership, as well as the potential benefits of using the materials within IAPT services.

Objectives

  • This project, funded by the SPCLS Teaching & Learning Enhancement Fund, was to create psychoeducational materials suitable for those with learning disabilities that depict Depression, GAD and Panic Disorder.
  • To effectively utilise student and staff feedback in the creation of these materials.

Context

The above project was undertaken because, within IAPT, Psychological Wellbeing Practitioners (PWPs) typically use materials that are text-heavy when explaining psychological disorders. This can create access barriers for those with learning disabilities, both within services and at a university teaching level.

The aim of the project was to create visual representations of how the person may be feeling depending on the psychological disorder.

Implementation

Allán Laville (Dean for Diversity and Inclusion) designed the concept for the psychoeducational materials for learning disabilities, and then approached Charlotte Field to see whether she wanted to take part in the development of these materials. It was important to include Charlotte here as she is training as a PWP and has also studied Art.

Charlotte Field’s experience

The preliminary stage in the project involved doing rough sketches of how Depression, GAD and Panic would be represented. These were discussed and evaluated within an initial focus group with other students on the MSci Applied Psychology Cohort 5. The subsequent reflection and review of the feedback received enabled me to produce drawings that were more interactive, as well as providing a more literal and a figurative version of each disorder to help make things clearer. In doing so, I made the drawings more accessible and appropriate for those with learning disabilities. I had the opportunity to review feedback on the completed drawings for a second time before the drawings were submitted.

Impact

Charlotte shares her view of the impact of completing this activity:

The materials here have been developed to add to the resources which could improve access for those with learning disabilities within Improving Access to Psychological Therapies (IAPT). As the rest of the MSci cohort and I are training as PWPs this was especially relevant to develop our clinical skills. These materials will be used in the training of future MSci cohorts – both within in-class role-plays and summative role-play assessments.

Reflections

Allán Laville’s reflections:

The student-staff partnership was key to the success of the project as we needed to ensure that the student voice was at the forefront. This was achieved in the work Charlotte completed herself as well as within the focus group and subsequent feedback on the psychoeducational materials over email. Based on this positive experience, we are keen to continue this approach to innovative T&L practices.

Charlotte Field’s reflections:

The student-staff partnership is of great importance as it builds collaboration and crucial links between students and staff. This is particularly important with projects such as this, as it combines the knowledge and expertise of experienced staff members with the student’s current experience of working within these services.

Follow up

In future, we will aim to develop similar psychoeducational materials for treatment interventions within Low Intensity Cognitive Behavioural Therapy; for example, materials for Behavioural Activation, which aims to increase an individual’s routine, necessary and pleasurable activities in order to improve their mood. This intervention would lend itself well to pictorial representations.

Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we have created a Blackboard feedback tool which draws on self-help strategies used in NHS mental health services. This was a TLDF project funded by CQSD, and we reflect upon the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are continually rated as the lowest item on student surveys, despite efforts from staff to address this. Whilst staff can certainly continue to improve their practices surrounding providing feedback, our efforts turned to how we could improve student engagement in this area. Upon investigation of existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the ‘Developing Engagement with Feedback Toolkit’ (DEFT) has useful components, like a glossary helping students with academic jargon, but it does not provide resources to help with feedback-related stress. The aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on self-help resources used within the NHS to treat anxiety and depression, such as cognitive restructuring. These ideas were then taken to a student focus group to gather students’ thoughts and opinions on what type of resource they would like to help them understand and use their feedback.

Considering ideas from the literature review and the focus group, we established the various components of the tool: a ‘purpose of feedback’ video, problem-solving and cognitive restructuring techniques, a reflective log, and a ‘where to go for further support’ page. Then, we started the creation of our prototype Blackboard tool. At the tool-creation stage, we worked collaboratively with the TEL team (Maria, Matt and Jacqueline) to help format and launch the tool. Upon launch, students were given access to the tool via Blackboard, and a survey to complete once they had explored and used the tool.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to identify, challenge and re-balance students’ negative or stressful thoughts related to receiving feedback. Some students reported in the tool survey that they found this technique useful.

As well as this, the examples seemed to help students link the tool to their past experiences of not getting a good grade. Students also appreciated the interactive features, like the video of the lecturer [addressing the fact that feedback is not a personal attack], and were looking forward to the tool being fully implemented during their next academic year. Overall, the student survey was positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the focus group and literature review contributions: the students’ ideas from the focus group built on the evidence-based self-help ideas gathered from the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in the future, in academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make the tool even more user-friendly, we could consider improving the initial main contents page: for example, presenting options like ‘I want to work on improving x’ which then lead on to the appropriate self-help resource, instead of simply starting with the resource options [e.g. problem solving, reflective log].

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk, Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments in the Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership, as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell. Chloe is a current PhD student and previous MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the concept for the electronic assessment overviews, and then approached Chloe Chessell to see whether she wanted to take part in their development. It was important to include Chloe here as she has lived experience of completing the programme and can therefore offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed add to the information that I had as a student, as this format allows students to review assessment information in their own time, and at their own pace. Screencasts can also be revisited, which may help students to ensure they have met the marking criteria for a specific assessment. Furthermore, the videos/links embedded within these screencasts to support the development of key writing skills (e.g. critical analysis skills) expand upon the information my cohort received, and will help students to develop these skills at the outset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project, as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, we now have a further four student-staff projects currently being completed, which we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from students’ experiences of receiving course information and their views on course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage with their course at a higher level, allowing them to have a role in shaping the course around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.

Clinical skills development: using controlled condition assessment to develop behavioural competence aligned to Miller’s pyramid

Kat Hall, School of Chemistry, Food and Pharmacy, k.a.hall@reading.ac.uk

Overview

The Centre for Inter-Professional Postgraduate Education and Training (CIPPET) provides PGT training for healthcare professionals through a flexible Masters programme built around blended learning modules alongside workplace-based learning and assessment. This project aimed to evolve the department’s approach to delivering one of our clinical skills workshops, which sits within a larger 60-credit module. The impact was shown via positive student and staff feedback, as well as interest in developing a standalone module for continuing further learning in advanced clinical skills.

Objectives

The aim of this project was to use controlled condition assessment approaches to develop behavioural competence at the higher levels of Miller’s pyramid of clinical competence (Miller, 1990).

Miller’s Pyramid of Clinical Competence

The objectives included:

  1. engage students in enquiry by promoting competence at higher levels of Miller’s pyramid
  2. develop highly employable graduates by identifying appropriate skills to teach
  3. evolve the workshop design by using innovative methods
  4. recruit expert clinical practitioners to support academic staff

Context

Health Education England is promoting a national strategy to increase the clinical skills training provided to pharmacists, and this project therefore aimed to evolve the department’s approach to delivering this workshop. The existing module design contained a workshop on clinical skills, but it was loosely designed as a large group exercise which was delivered slightly differently for each cohort. This prevented students from fully embedding their learning through opportunities to practise skills alongside controlled formative assessment.

Implementation

Equipment purchase: as part of this project, matched funding was received from the School to support the purchase of simulation equipment, which meant a range of clinical skills teaching tools could be utilised in the workshops. This step was undertaken collaboratively with the physician associate programme to share learning, and supported meeting objective 2 across the School.

Workshop design: the workshops were redesigned by the module convenor, Sue Slade, around specific aspects of clinical skills that small groups could focus on with a facilitator. The facilitators were supported to embed the clinical skills equipment within the activities, thereby engaging students in active learning. The equipment allowed students the opportunity to simulate the skills test and identify whether they could demonstrate competence at the Knows How and Shows How levels of Miller’s Pyramid of Clinical Competence. Where possible, the workshop stations were facilitated by practising clinical practitioners. This step was focused on meeting objectives 1, 2, 3 and 4.

Workbook design: a workbook was produced that students could use to identify the core clinical skills they required in their scope of practice, and thus needed to practise in the workshop and further in their workplace-based learning. This scaffolding supported their transition to the Does level of Miller’s Pyramid of Clinical Competence. This step was focused on meeting objectives 1 and 3.

Impact

All four objectives were met and have since been mapped to the principles of the Curriculum Framework to provide evidence of their impact.

Mastery of the discipline / discipline based / contextual: this project has supported the academic team to redesign the workshop around the evolving baseline core knowledge and skills required of students.  Doing this collaboratively between programme teams ensures it is fit for purpose.

Personal effectiveness and self-awareness / diverse and inclusive: the positive staff and student feedback received reflects that the workshop provides a better environment for student learning, enabling them to reflect on their experiences and take their learning back to their workplace more easily.

Learning cycle: the student feedback has shown that they want more of this type of training and so the team have designed a new stand-alone module to facilitate extending the impact of increasingly advanced clinical skills training to a wider student cohort.

Reflections

What went well? The purchase of the equipment and the redesign of the workshop were relatively simple tasks for an engaged team, and low effort for the potential return in improved experience. Having one lead for the workshop, whilst another wrote the workbook and purchased the equipment, ensured that staff across the team could contribute as change champions. Recruitment of an advanced nurse practitioner to support the team more broadly was completed quickly and provided support and guidance across the year.

What did not go as well?  Whilst the purchase of the equipment and workshop redesign was relatively simple, encouraging clinical practitioners to engage with the workshop proved much harder.  We were unable to recruit consistent clinical support which made it harder to fully embed the project aims in a routine approach to teaching the workshop.  We considered using the expertise of the physician associate programme team but, as anticipated, timetabling made it impossible to coordinate the staffing needs.

Reflections: The success of the project lay in having the School engaged in supporting the objectives and the programme team invested in improving the workshop.  Focusing this project on a small part of the module meant it remained achievable to complete one cycle of change to deliver initial positive outcomes whilst planning for the following cycles of change needed to fully embed the objectives into routine practice.

Follow up

In planning the next series of workshops, we plan to draw more widely on University alumni from the physician associate programme to continue the collaborative approach and to attract clinical practitioners who are more willing to support us and less constrained by timetables and clinical activities.

Based on student and staff feedback, there is clearly a desire for more teaching and learning of this type, and being able to launch a new standalone module in 2020 is a successful output of this project.

Links and References

Miller, G.E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63-S67.

Electronic Management of Assessment: Creation of an e-Portfolio for PWP training programmes

Tamara Wiehe, Charlotte Allard & Hayley Scott (PWP Clinical Educators)

Charlie Waller Institute; School of Psychology and Clinical Language Sciences

Overview

In line with the University’s transition to Electronic Management of Assessment (EMA), we set out to create an electronic Portfolio (e-Portfolio) for use on our Psychological Wellbeing Practitioner (PWP) training programmes, replacing the existing hard-copy format. The project spanned almost a year (October 2018 – September 2019), as we took the time to consider the implications for students, supervisors in our IAPT NHS services, University administrators and markers. Working closely with the Technology Enhanced Learning (TEL) team led us to a viable solution, which was launched with our new cohorts from September 2019.

Image of portfolio template cover sheet

Objectives

  • Create an electronic Portfolio in line with EMA that overcomes existing issues and improves the experience for students, NHS supervisors, administrators and markers.
  • Work collaboratively with all our key stakeholders to ensure that the new format satisfies their various needs.

Context

A national requirement for PWPs is to complete a competency-based assessment in the form of a Portfolio that spans the three modules of their training. Our students are employed by NHS services across the South of England, and many live close to their service rather than the University.

The issue? The previous hard-copy format meant that students spent time and money printing their work and travelling to the University to submit/re-submit it. University administrators and markers reported issues with transporting the folders to markers and storing them, especially with the larger cohorts.

The solution… To resolve these issues by transitioning to an electronic version of the Portfolio.

Implementation

  1. October 2018: An initial meeting with TEL was held in order to discuss the practicalities of an online Portfolio submission.
  2. October 2018 – March 2019: TEL created several prototype options for submission via Blackboard, including the use of the journal tool and a zip file. Due to practicalities, the course team decided on a single-file Word document template.
  3. April – May 2019: Student focus groups were conducted on both programmes (undergraduate and postgraduate) in which the same assessment sits, to gain feedback on the potential solution we had created. Using the outcomes of the focus groups and staff meetings, it was unanimously agreed that the proposed solution was a viable option for use with our future cohorts.
  4. June 2019: TEL delivered a training session for staff and administrators to become familiar with the process from both student and staff perspectives. TEL also created a guidance document for administrators on how to set up the assignment on Blackboard.
  5. July – August 2019: Materials including the template and rubrics were amended and formatted to meet the requirements for online submission for both the MSci and PWP courses. Resources were also created for students to access on Blackboard, such as screencasts on how to access, utilise and submit the Portfolio using the electronic format; the aim of this is to improve accessibility for all students on the course.
  6. September 2019: Our IAPT services were notified of the changes as the supervisors there are responsible for reviewing and ‘signing off’ on the student’s performance before the Portfolio is submitted to the University for a final check.

Image of 'how to' screencast resources on Blackboard

Impact

Thus far, the project has achieved the objectives it set out to. The template for submission is now available for students to complete throughout their training course. This will modernise the submission process and be less burdensome for the students, supervisors, administrators and markers.

Image of the new portfolio process

The students in the focus group reported that this would significantly simplify the process and remove the barriers they had often reported with completing and submitting the Portfolio. So far, there have not been any unexpected outcomes from the development of the Portfolio. However, we aim to review the process with the first online Portfolio submission in June 2020.

Reflections

Upon reflection, the development of the online Portfolio has so far been a success. Following student feedback, we listened to what would improve the experience of completing the Portfolio. From this we developed an online Portfolio meeting the requirements across two BPS-accredited courses, which will be used for future cohorts of students.

Additionally, the collaboration between staff, students and the TEL team, has led to improved communication across teams with new ideas shared; this is something we have continued to incorporate into our teaching and learning projects.

An area to develop for the future would be to utilise dedicated Portfolio software. Initially, we wanted to use a journal tool on Blackboard; however, it was not suitable for the needs of the course (most notably exporting the submission and mark sheet to external parties). We will continue to review these options and to gain feedback from future cohorts.