Learning to Interpret and Assess Complex and Incomplete Environmental Data

Andrew Wade a.j.wade@reading.ac.uk

Department of Geography and Environmental Sciences

Overview

Fieldwork is well known to improve student confidence and enhance skills and knowledge, yet there is evidence of a decline in fieldwork in secondary education, especially amongst A-level Geography students. This is problematic because students are entering Geography and Environmental Science degree programmes with reduced skills and confidence around field-based data collection and interpretation, which appears to be leading to apprehension around data collection for dissertations. A simple field-based practical was developed in which 47 Part 2 Geography and Environmental Science students tested their own hypotheses about the factors that control water infiltration into soils. In a survey of the student experience, students reported improved confidence and a greater appreciation of critical thinking around environmental data. Student coursework demonstrated that attainment was very good, and that skills and critical thinking can be recovered and enhanced with relatively simple, low-cost field-based practical classes that can be readily embedded to scaffold subsequent modules, including the dissertation.

Context

The importance of fieldwork is well established in Geography and Environmental Science as a means of active and peer-to-peer learning. However, students appear to have little confidence in designing their own fieldwork for hypothesis testing when they arrive at Part 1, probably due to a decline in fieldwork in secondary education (Kinder 2016, Lambert and Reiss 2014). Within the Geography and Environmental Science programmes there is a Part 2, 20-credit ‘Research Training’ module that develops these skills. However, students see this module and the dissertation as high risk, perceiving that a low mark would have a significant negative impact on their overall degree classification. Consequently, students are seemingly risk averse around field-based projects. The idea here is to make field-based training more commonplace across multiple modules through the inclusion of relatively simple practical training, so that hypothesis testing, critical thinking and confidence with ‘messy’ environmental data become intuitive and students are at ease with these concepts. In parallel, GES module cohorts have grown in recent years, an additional reason to develop simple, low-cost practical classes.

Objectives

The aim of the project was to determine whether a simple, field-based practical would help boost student confidence around field data collection, interpretation and hypothesis testing. The objective was to give the students a safe and supportive environment in which to develop their own hypotheses and methods for field data collection, and to learn to interpret often ‘messy’ and ‘complex’ environmental data.

Figure 1: The practical class took place on the hill-slope on campus between the Atmospheric Observatory and Whiteknights Lake on 28 October 2019, over four hours in total.

 

Figure 2: Students used a Decagon Devices Mini-Disc Infiltrometer to measure unsaturated hydraulic conductivity to test their own hypotheses about the factors controlling infiltration.

Implementation

A practical was designed in which 47 Part 2 students, working in groups of four or five, developed their own hypotheses about the factors controlling rainfall infiltration on a hill-slope in the classroom, following an in-class briefing, and then tested these hypotheses in the field using Mini Disc infiltrometers (Figs. 1, 2 and 3). In a further follow-up session, each student spent two hours processing the data collected and was briefed on the coursework write-up.

Figure 3: The students tested hypotheses around distance from the lake, vegetation and soil type, soil moisture and soil compaction. Each student group spent two hours in the field.

Impact

Of 40 students who responded to an online survey:

  • 37 agreed the practical helped develop their critical thinking skills around complex and incomplete environmental data;
  • 36 agreed they were now better able to deal with uncertainty in field-based measurements;
  • 38 felt more confident working in the field.

Student quotes included:

  • “The practical was very useful in helping to understand the processes happening as well as being more confident in using the equipment.”
  • “I thought the practical was good as it was another way to process information which tends to work better for me, doing and seeing how it works allows me to gain a higher understanding in the processes”

The majority of students gained first-class and upper-second-class marks for the project write-up, and the reports submitted demonstrated good critical thinking skills in the interpretation of the infiltration measurements. There has been a noticeable increase in the number of students opting for hydrology-based dissertations.

Reflections

Confidence and critical thinking skills can be enhanced with relatively simple, low-cost field-based practicals that focus on hypothesis testing in addition to knowledge acquisition, and that scaffold subsequent modules, including Research Training for Geographers and Environmental Science and the dissertation. Each student spent two hours in the field on campus and two hours processing their data, with further time on the coursework write-up. This seems a reasonable investment of time given the benefits in confidence, skills and knowledge. Embedding such practicals should not replace the larger skills-based modules, such as Research Training, nor should they entirely replace classes that focus more on knowledge acquisition; however, practical classes where students explore their own ideas appear to be a useful means of boosting student confidence and critical thinking skills at an early stage. The practical was also an excellent means of encouraging peer-to-peer interaction and learning, and this and similar practical classes have good potential for the integration of home and NUIST students.

Follow up

Embed similar practical classes in part one modules to build confidence at the outset of the degree programme and, at part three, to further enable integration of home and NUIST students.

Links and References

Kinder A. 2016. Geography: The future of fieldwork in schools. Online: http://www.sec-ed.co.uk/best-practice/geography-the-future-of-fieldwork-in-schools/ (Last accessed: 03 Jan 2020).

Lambert D and Reiss MJ. 2014. The place of fieldwork in geography and science qualifications. Institute of Education, University of London. ISBN: 978-1-78277-095-4. 20 pp.

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a redesign of the Part 1 autumn/spring chemistry module, CH1PRA, which serves approximately 45 students per year. All students complete practical work over 20 weeks of the year, in four five-week blocks taken in rotation (introductory, inorganic, organic and physical), spending one afternoon (four hours) in the laboratory per week. The redesign was partly prompted by COVID, which forced us to look critically at the experiments the students completed: we needed to ensure that the practical skills students developed during the pandemic were relevant for Part 2 and beyond, and that the assessments could stand alone as exercises if COVID prevented the completion of practical work. In fact, COVID provided us with an opportunity to reinvigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

• Redesign CH1PRA so it was COVID-safe and fulfilled strict accreditation criteria
• Redesign the experiments so that as many students as possible could complete practical work, converting some experiments for completion on the open bench to maximise laboratory capacity
• Redesign assessments so that students who missed sessions due to COVID could still gain credit
• Minimise the assessment load on academic staff and students
• Move to a more skills-based assessment paradigm, away from the traditional laboratory report

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of practical classes due to restrictions on the number of students allowed in the laboratory: 12 students in the fumehoods and 12 on the open bench (rather than up to 74 students all using fumehoods previously). Prior to the redesign, each student completed four or five assessments per five-week block, all of which related to a laboratory-based experiment. The majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as intended, so execution of the experiment was passive. The technical reports placed a significant marking burden on academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments on completion of a practical experiment was also deemed high-risk with the advent of COVID, so we had to rethink our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing 24 students to be processed per week (12 on the bench and 12 in the fumehood), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class and therefore fulfilling our accreditation requirements. The majority of assessments were linked to these ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that a student who had to self-isolate could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each five-week block had three assessment points, and where possible one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require in employment or on placement. To encourage active learning, reflection was embedded into the assessment regime; we hoped that by critically appraising their performance in the laboratory, students would better remember the skills and techniques they had learnt, rather than falling into the “see, do, forget” mentality that is prevalent in practical classes.

Examples of assessments include: data analysis, focussing on clear presentation of data; critical self-reflection on the skills developed during a practical class, i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critical engagement with a published scientific procedure; and a three-minute presentation about a practical scientific technique commonly encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well-run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments that we had to make to the delivery of laboratory based practicals did not have a negative impact upon the students’ experience and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous: there was a slight disconnect between what we expected students to achieve from the online practicals and what they were achieving. A number of the students polled disliked the online practical work, mainly because the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and by ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they align more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.


Improving student assessment literacy & engaging students with rubrics

Dr. Allan Laville

School of Psychology & Clinical Languages Sciences

In this 14-minute video, early rubrics adopter Dr. Allan Laville shares how he and colleagues in Psychology have sought to improve student assessment literacy, and have successfully engaged students with their assessment rubrics by embedding analysis of them in in-class teaching and by using screencasts, discussion boards and student partnership. Lots of useful ideas and advice – well worth a watch.

Promoting and Tracking Student Engagement on an Online Undergraduate Pre-sessional Course

Sarah Mattin: International Study and Language Institute

Overview

This case study outlines approaches to fostering an active learning environment on the University’s first fully online Undergraduate Pre-sessional Course which ran in Summer 2020 with 170 students. It reports staff and student feedback and reflects on how lessons learnt during the summer can inform ISLI’s continued online delivery this autumn term and beyond.

 

Objectives

  • To design and deliver an online Pre-sessional Course to meet the needs of 170 students studying remotely, mostly in China
  • To promote student engagement in learning activities in an online environment
  • To devise effective mechanisms for tracking student engagement and thus identify students who may require additional support

 

Context

The Pre-sessional Programme (PSE) is an English for Academic Purposes (EAP) and academic skills development programme for degree offer holders who require more study to meet the English Language requirements of their intended programme. The programme runs year-round and, in the summer, has separate UG and PG courses. We would usually expect to welcome around 700 students to the campus for the summer courses (June-September); in summer 2020 we took the courses fully online in response to the COVID crisis. This case study focuses on the Undergraduate Course.

 

Implementation

Due to the constraints of the time difference between the UK and China, where most students were based, we knew learning on the course would need to be largely asynchronous. However, we were keen to promote active learning and so adopted the following approaches:

  • Use of the online authoring tool Xerte to create interactive learning materials which enabled students to have immediate feedback on tasks.
  • Incorporation of asynchronous peer and student-teacher interaction into the course each week through scaffolded tasks for the Blackboard Discussion Boards.
  • Setting up of small study groups of 3-4 students within each class of 16 students. Each group had fortnightly tutorials with the teacher and was encouraged to use the group for independent peer support.
  • Live online interactive sessions which took a ‘flipped’ approach, so students came prepared to share and discuss their work on a set task and ask any questions.

In order to track engagement with the learning materials we used Blackboard Tests to create short (4-5 questions) ‘Stop & Check’ quizzes at regular intervals throughout the week. We used the Grade Centre to monitor completion of these. We also made use of other student engagement monitoring features of Blackboard, in particular the Retention Centre within Evaluation and Blackboard Course Reports which enable instructors to track a range of user activity.

 

Impact

Our tracking showed that most students were engaging with the tasks daily, as required. We were very quickly able to identify a small group of students who were not engaging as hoped and target additional communication and support to these students.

Student feedback demonstrated that students perceived improvements in their language ability across the four skills (reading, writing, speaking and listening), and this was confirmed by their results at the end of the course. Student outcomes were good, with over 90% of students achieving the scores they needed to progress to their chosen degree programme. This compares favourably with the progression rate for the on-campus course run in previous years.

Feedback from teachers on the learning materials was very positive. One teacher commented that ‘The videos and Xerte lessons were excellent. As a new teacher I felt the course was very clear and it has been the best summer course I have worked on’. Teachers highlighted Xerte, the Discussion Boards and the interactive sessions as strengths of the course.

The materials and overall design of the course have informed the Pre-sessional Course (PSE 1) which is running this Autumn Term.

 

Reflections

Overall, we designed and delivered a course which met our objectives. Some reflections on the tools and approaches we employed are as follows:

Xerte lessons: these were definitely a successful part of the course, enabling us to provide interactive asynchronous learning materials with immediate feedback to students. We also found the Xerte lessons enabled us to make coherent ‘packages’ of smaller tasks, helping us to keep the Blackboard site uncluttered and easy to navigate.

Discussion Boards: teacher feedback indicated that some felt this part of the course was an enhancement on the previous face-to-face delivery. Points we found key to the success of Discussion Board tasks were:

  • Creation of a new thread for each task to keep threads a manageable size
  • Linking to the specific thread from the task using hyperlinks
  • Detailed and specific Discussion Board task instructions for students broken down into steps of making an initial post and responding to classmates’ posts with deadlines for each step
  • Teacher presence on the Discussion Board
  • Teacher feedback on group use of the Discussion Board in live sessions to reinforce the importance of peer interaction

Small study groups: these were a helpful element of the course and greater use could have been made of them. For example, one teacher developed a system of having a rotating ‘group leader’ who took responsibility for guiding the group through an assigned task each week. In the future we could incorporate this approach and build more independent group work into the asynchronous learning materials to reinforce the importance of collaboration and peer learning.

Live sessions: student feedback showed clearly that this was an aspect of the course they particularly valued. Both students and teachers felt there should be more live contact, but that the sessions need not be long; even an additional 30 minutes a day would have made a difference. Teachers and students commented that Teams provided a more stable connection for students in China than Blackboard Collaborate.

Blackboard Tests and the monitoring features of Blackboard: these were undoubtedly useful tools for monitoring student engagement. However, they generate a great deal of data which is not always easy to interpret ‘at a glance’ and which provides a fairly superficial account of engagement. Most teachers ended up devising their own tracking systems in Excel, which enabled them to identify and track performance on certain key tasks each week.
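The trackers teachers built in Excel mostly boiled down to the same simple computation: compare each student's completed ‘Stop & Check’ quizzes against the set issued that week and flag anyone falling below a threshold. A minimal sketch of that logic (student names, quiz IDs and the 80% threshold here are invented for illustration, not taken from the course):

```python
def flag_low_engagement(completions, quizzes_set, threshold=0.8):
    """Return students whose quiz-completion rate is below `threshold`.

    completions: dict mapping student name -> set of completed quiz IDs
                 (e.g. exported from the Grade Centre)
    quizzes_set: the full set of quiz IDs issued that week
    """
    flagged = {}
    for student, done in completions.items():
        # Only count completions that belong to this week's quizzes
        rate = len(done & quizzes_set) / len(quizzes_set)
        if rate < threshold:
            flagged[student] = rate
    return flagged

# Illustrative data: one engaged student, one who completed a single quiz
week1 = {"q1", "q2", "q3", "q4"}
completions = {
    "Student A": {"q1", "q2", "q3", "q4"},
    "Student B": {"q1"},
}
print(flag_low_engagement(completions, week1))  # → {'Student B': 0.25}
```

A spreadsheet does the same job with a COUNT over completion columns; the point is that a focused, per-task view like this surfaces disengaged students faster than the raw Blackboard reports.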

 

Follow up

Taking into account the feedback from this year, the materials developed could be used in future to facilitate a flipped learning approach on the course, with students studying on campus or remotely. This would address the calls for more teacher-student interaction and enable the course to respond flexibly to external events. Currently, we are applying lessons learnt from the summer to the delivery of our Pre-sessional and Academic English Programmes running this term.

 

Links

The Pre-sessional English and Academic English Programme webpages give more details about the Programmes

Pre-sessional: http://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Academic English Programme: http://www.reading.ac.uk/ISLI/enhancing-studies/isli-aep.aspx

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen, School Director of Teaching and Learning, ISLI, and Alison Nicholson, Honorary Fellow, UoR

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, the challenges and the outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into its assessment mix. Group assessments constitute around 4% of the University’s assessment types (EMA data, academic year 2019-20).

Objectives

  • To improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • To enable students to be more relaxed and produce more authentic and spontaneous language
  • To make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules centre on the ability to communicate in the language. The teaching methodology favours student–student interaction and collaboration: in class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers, as well as students, were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed. One teacher concentrated on running the exam and marking using holistic marking criteria, while the second listened and marked using analytic rating scales

Expected outcomes

  • Students to be more relaxed and produce more authentic and spontaneous language
  • Student-to-student interaction creating a more relaxed atmosphere, longer speaking turns, and greater use of features of interaction (Hardi Prasetyo, 2018)
  • Perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how does the competence of the person I am speaking to affect my outcomes? (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses from students of Italian. Responses were consistent across the two languages.

Eight teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, nor did they find speaking to a student more relaxing. The qualitative feedback tells us that this is due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for the average student cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on students’ ability to demonstrate that they could communicate in the language.

However, this was not reflected in the students’ feedback: there is a clear discrepancy between teachers’ and students’ perceptions of how far the new format allows students to showcase their learning.

Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi & French (2011). In Taylor, L. (ed.), Examining Speaking: Research and practice in assessing second language speaking. Cambridge: CUP.

Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014). Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015). The Lecturer’s Toolkit: A practical guide to assessment, learning and teaching (4th ed.). London; New York: Routledge.

 

How ISLI’s Assessment Team created an online oral exam for the Test of English for Educational Purposes (TEEP)

Fiona Orel – International Study and Language Institute (ISLI)

 

Overview

ISLI’s Test of English for Educational Purposes (TEEP) is administered at the end of pre-sessional courses as a measure of students’ academic English proficiency. The speaking test has traditionally been an academic discussion between two students that is facilitated by an interlocutor and marked by an observer.

This case study outlines the process of creating a version of the TEEP speaking test for 1-1 online delivery.

Objectives

  • To create an online TEEP speaking test that could be administered at the beginning of June to 95 students
  • To ensure reliability and security of results
  • To support students and teachers with the transition

Context

The Pre-sessional English course 3 (PSE 3) started in April, during the period of lockdown. At the end of the course all students sit a TEEP test, which includes a test of speaking skills. We realised that we would not be able to administer the usual two-student + two-teacher test, given the constraints of the technology and the changes in teaching and learning, which had reduced to a certain degree the students’ opportunities for oral interaction. We would therefore need to develop a new 1-1 test that maintained the validity and reliability of the original TEEP speaking test.

Implementation

We had two main objectives: to create a valid online 1-1 speaking test, and to make sure that the technology used to administer the test was simple and straightforward for both teachers and students, and would have reasonably reliable connectivity in the regions where students were based (China, the Middle East and the UK).

The first thing we needed to do was return to our test specifications: what exactly were we hoping to assess through the oral exam? The original face-to-face test had five criteria: overall communication, interaction, fluency, accuracy and range, and intelligibility. We knew that interaction had been impacted by the move online, but decided that responding appropriately to others was a crucial aspect of interaction that needed to remain, and included this in the ‘overall communication’ criterion. Recognising that interlocutors would also need to be examiners, we worked on streamlining the criteria to remove redundancy and repetition, and to ensure that each block contained the same type of description in the same order, making it easier for tutors to skim and recall.

We then worked out exactly what speaking functions and skills we wanted to test, and how we could do that while mostly working with existing resources. We aligned with the original test specifications by testing students’ ability to:

  • Provide appropriate responses to questions and prompts
  • Describe experiences and things
  • Give and justify an opinion by, for example, stating an opinion, explaining causes and effects, comparing, evaluating.

The test format that enabled this was:

  • Part one: an interview with the student about their studies and experience of studying online
  • Part two: a problem-solving scenario. Students are introduced to a problem, which the teacher screen-shares with them, and are given three possible solutions to compare, evaluate and rank from most to least effective
  • Part three: abstract discussion building on the talk given in part two

The final stage was trialling a platform on which to conduct the tests. We had considered Zoom for its reliability but discounted it due to security concerns. Blackboard Collaborate had connectivity issues in China, so we decided to use Teams: connectivity was generally better, and students and teachers were familiar with the platform, having used it for tutorials. Due to the spread of students across time zones, we spread the speaking tests over three mornings, finishing by 11:00 BST each day. We kept the final slot on Friday free for all teachers, to enable rescheduling of tests for any student experiencing connectivity issues on the day.

Finally, we needed to help teachers and students prepare for the tests. For students, learning materials were produced with videos of a sample test; a well-attended webinar introduced the format and requirements, and the recording of this webinar was made available to all students along with a document on their BB course. This explained what to do before test day and what to expect on the day itself.

The test format and procedures were introduced to teachers with instructions for tasks to complete before, during, and after the test. An examiner’s script with integrated instructions and speech was also prepared to standardise how the tests were administered. Each test was recorded to ensure security and to enable moderation, and all students had to verify their identity at the start of the test. The recording caused some problems, as the video had to be downloaded and deleted from Stream before anyone else in the Teams meeting, including the student who had been tested, could access it. For this reason we allowed 40 minutes for each 20-minute interview, as downloading could be a lengthy process depending on internet speeds. We had two or three people available each day to pick up any problems, such as a teacher being unwell or having technical issues, or a student experiencing difficulties. This worked well: on the first two days we did have to reschedule a number of tests, but fortunately all went smoothly on the final day. The teachers were fully committed and worked hard to put students at ease; informal feedback from students showed that they appreciated the opportunity to talk one-to-one with a tutor, and tutors said that the test format provided plenty of evidence on which to base a decision.

Impact

The test was successful overall and there were fewer technical issues than we had anticipated. Teachers and students were happy with it as an assessment measure and we were able to award valid and reliable grades.

Working collaboratively with the teachers and the Programme Director was incredibly rewarding and meant that we had a wide base of talent and experience to draw on whenever we ran into problems.

Reflections

Incredibly detailed planning, the sharing of information across Assessment and Pre-sessional Teams, and much appreciated support from the TEL team helped to make the test a success. Students and teachers had very clear and detailed instructions and knew exactly what was expected and how the tests would be conducted. The sharing of expertise across teams meant that problems were solved quickly and creatively, and it is good to see this practice becoming the norm.

We need to work on the issue of downloading and deleting the video after each test, as this caused anxiety for some teachers with slower internet connections. We also need more technical support available, especially on the first day: most students had tested their equipment as instructed, but some who had not done so experienced issues. It would be even better if a similar activity could be built into the course so that teachers and students experience the test situation before the final test.

Follow up

ISLI’s Assessment Team is now preparing to administer the same tests to a much larger cohort of students at the end of August. We will apply the lessons learned during this administration and work to make the process easier for teachers.

The DEL Feedback Action Plan

Madeleine Davies, Cindy Becker and Michael Lyons- SLL

Overview

A feedback audit and consultation with the Student Impact Network revealed a set of practices DEL needs to amend. The research produced new student-facing physical and online posters, designed by a ‘Real Jobs’ student, to instruct students on finding their feedback online, and generated ‘marking checklists’ for staff to indicate what needs to be included in feedback and what needs to be avoided.

Objectives

  • To assess why students scored DEL poorly on feedback in NSS returns
  • To consult with students on types of feedback they considered useful
  • To brief colleagues on good practice feedback
  • To produce consistency (but not conformity) in terms of, for example, the amount of feedback provided, feedforward, full feedback for First Class work, etc.
  • To assess whether marking rubrics would help or hinder DEL feedback practice

Context

The ‘DEL Feedback Action Project’ addresses the persistent issue of depressed NSS responses to Department of English Literature assessment and feedback practices. The responses to questions in ‘teaching quality’ sections are favourable, but the 2018 NSS revealed that, for English Studies, Reading is in the third quartile for the ‘Assessment and Feedback’ section and the bottom quartile for question 8 (scoring 64% against a 74% median) and question 9 (scoring 70% against a 77% median).

In October 2018, DEL adopted eSFG. An EMA student survey undertaken in January 2019 polled 100 DEL students and found that, though students overwhelmingly supported the move to eSFG, complaints about the quality of DEL feedback persisted.

Implementation

Michael Lyons began the project with an audit of DEL feedback and identified a number of areas where the tone or content of feedback might need improving. Anonymised samples of this feedback were then shown to the Student Impact Network, and students commented on them. This produced a set of indicators which became the basis of the ‘marking checklist’ for DEL staff. Simultaneously, DEL staff were asked to discuss feedback practice in ‘professional conversations’ for the annual Peer Review exercise, ensuring that the combined minds of the whole department were reflecting on this issue.

Student consultation also revealed that many students struggle to find their feedback online. With this in mind, we collaborated with TEL to produce ‘maps to finding feedback’ for students. A ‘Real Jobs’ student designer converted this information into clear, readable posters which can be displayed online or anywhere in the University (the information is not DEL-specific). The posters will be of particular use for incoming students but our research also suggested that Part 3 students are often unaware of how to access feedback.

The results of the initial audit and consultation with students indicated where our feedback had been falling short. We wrote a summary of these findings for the DEL HoD and DDTL.

Research into marking rubrics revealed that DEL marking would not be suited to this feedback practice: rubrics can be inflexible, and DEL students resist ‘generic’ feedback.

Impact

The student-facing posters and staff-facing ‘marking checklist’ speak to two of the main issues with DEL feedback that students identified. The latter will deter overly brief, curt feedback and will prompt more feedforward and comment on specific areas of the essay (for example, the introductory passage, essay structure, referencing, grammar, use of secondary resources, etc.).

With DEL staff now focused on the feedback issue, and with students equipped to access their feedback successfully, we are hoping to see a marked improvement in NSS scores in this area in 2020-21.

For ‘surprises’, see ‘Reflections’.

Reflections

The pressure on academic staff to mark significant amounts of work within tight deadlines can lead to potential unevenness in feedback. We are hoping that our research prompts DEL to streamline its assessment practice to enhance the quality and consistency of feedback and feedforward.

Students’ responses in the Student Impact Network also suggested that additional work is required on teaching students how to receive feedback, as over-sensitivity in some areas can produce negative scores. With this in mind, the project will conclude with an equivalent of the ‘marking checklist’ designed for students. This will remind students that feedback is anonymous, objective, and intended to pave the way to success.

Follow up

We will monitor NSS DEL feedback scores in the 2020-21 round and poll students in the next session to ensure that they are now able to access their feedback.

We will also continue to reflect on colleagues’ marking workload and the link between this and unconstructive feedback.

Developing psychoeducational materials for individuals with learning disabilities

Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology) and Charlotte Field (Research Assistant and student on MSci Applied Psychology)

Overview

This project aimed to improve access to psychoeducational materials by addressing the diverse needs of those accessing Improving Access to Psychological Therapies (IAPT) services. We created materials that could be used to describe psychological disorders, such as Depression and Generalized Anxiety Disorder (GAD), to people with learning disabilities. Here we reflect upon the benefits of completing this project via a student-staff partnership, as well as the potential benefits of using the materials within IAPT.

Objectives

  • This project, funded by the SPCLS Teaching & Learning Enhancement Fund, was to create psychoeducational materials, suitable for those with learning disabilities, depicting Depression, GAD and Panic Disorder.
  • To effectively utilise student and staff feedback in the creation of these materials.

Context

The project was undertaken because, within IAPT, Psychological Wellbeing Practitioners (PWPs) typically use text-heavy materials when explaining psychological disorders. This can create access barriers for those with learning disabilities, both within services and at a university teaching level.

The aim of the project was to create visual representations of how the person may be feeling depending on the psychological disorder.

Implementation

Allán Laville (Dean for Diversity and Inclusion) designed the psychoeducational materials for learning disabilities concept and then approached Charlotte Field to see whether she wanted to take part in the development of these materials. It was important to include Charlotte here as she is training as a PWP and has also studied Art.

Charlotte Field’s experience

The preliminary stage of the project involved producing rough sketches of how Depression, GAD and Panic would be represented. These were discussed and evaluated in an initial focus group with other students in MSci Applied Psychology Cohort 5. Reflecting on and reviewing the feedback received enabled me to produce drawings that were more interactive, and to provide both a literal and a figurative version of each disorder to make things clearer. This made the drawings more accessible and appropriate for those with learning disabilities. I had the opportunity to review feedback on the completed drawings a second time before they were submitted.

Impact

Charlotte shares her view of the impact of completing this activity:

The materials have been developed to add to the resources that could improve access for those with learning disabilities within Improving Access to Psychological Therapies (IAPT). As the rest of the MSci cohort and I are training as PWPs, this was especially relevant to developing our clinical skills. These materials will be used in the training of future MSci cohorts, both within in-class role-plays and summative role-play assessments.

Reflections

Allán Laville’s reflections:

The student-staff partnership was key to the success of the project as we needed to ensure that the student voice was at the forefront. This was achieved in the work Charlotte completed herself as well as within the focus group and subsequent feedback on the psychoeducational materials over email. Based on this positive experience, we are keen to continue this approach to innovative T&L practices.

Charlotte Field’s reflections:

The student-staff partnership is of great importance as it builds collaboration and crucial links between students and staff. This is particularly important with projects such as this as it combines the knowledge and expertise from experienced staff members with the student’s current experience working within these services.

Follow up

In future, we will aim to develop similar psychoeducational materials for treatment interventions within Low Intensity Cognitive Behavioural Therapy: for example, materials for Behavioural Activation, which aims to increase an individual’s routine, necessary and pleasurable activities in order to improve mood. This intervention would lend itself well to pictorial representations.

Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we created a Blackboard feedback tool which draws on self-help strategies used in NHS mental health services. This was a CQSD TLDF-funded project, and we reflect upon the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are consistently rated as the lowest items on student surveys, despite staff efforts to address this. Whilst staff can certainly continue to improve their practices around providing feedback, our efforts turned to how we could improve student engagement in this area. On investigating existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the ‘Developing Engagement with Feedback Toolkit’ (DEFT) has useful components, such as a glossary to help students with academic jargon, but it does not provide resources to help with feedback-related stress. Our aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on self-help resources, such as cognitive restructuring, used within the NHS to treat anxiety and depression. These ideas were taken to a student focus group to gather students’ thoughts and opinions on what type of resource they would like to help them understand and use their feedback.

Drawing on ideas from the literature review and the focus group, we established the components of the tool: a ‘purpose of feedback’ video, problem-solving and cognitive restructuring techniques, a reflective log, and a ‘where to go for further support’ page. We then began creating our prototype Blackboard tool, working collaboratively with the TEL team (Maria, Matt and Jacqueline) to format and launch it. Upon launch, students were given access to the tool via Blackboard, along with a survey to complete once they had explored and used it.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to identify, challenge and re-balance students’ negative or stressful thoughts related to receiving feedback, and some students reported in the tool survey that they found this technique useful.

The examples also seemed to help students relate the techniques to past experiences of not getting a good grade. Students appreciated the interactive features, such as the video of the lecturer (addressing the fact that feedback is not a personal attack), and were looking forward to the tool being fully implemented in the next academic year. Overall, the survey responses were positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the contributions of the focus group and the literature review: the students’ ideas from the focus group complemented the evidence-based self-help strategies gathered from the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in future academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make the tool even more user-friendly, we could improve the initial main contents page: for example, presenting options such as ‘I want to work on improving x’ that then lead to the appropriate self-help resource, rather than simply starting with the resource options (e.g. problem solving, reflective log).

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk , Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments in Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell, a current PhD student and former MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the electronic assessment overview concept and then approached Chloe Chessell to see whether she wanted to take part in the development of these overviews. It was important to include Chloe here as she has lived experience of completing the programme and therefore, can offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed add to the information that I had as a student, as this format allows students to review assessment information in their own time and at their own pace. Screencasts can also be revisited, which may help students ensure they have met the marking criteria for a specific assessment. Furthermore, the videos and links embedded within these screencasts to support the development of key writing skills (e.g. critical analysis) expand upon the information my cohort received, and will help students to develop these skills at the outset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project, as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, a further four student-staff projects are currently underway, which we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from students’ experiences of receiving course information and their views on course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage with their course at a higher level, allowing them to have a role in shaping it around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.