Learning to Interpret and Assess Complex and Incomplete Environmental Data

Andrew Wade a.j.wade@reading.ac.uk

Department of Geography and Environmental Sciences

Overview

Field work is well known to improve student confidence and enhance skills and knowledge, yet there is evidence of a decline in field work in Secondary Education, especially amongst A-level Geography students. This is problematic because students are entering Geography and Environmental Science degree programmes with reduced skills and confidence around field-based data collection and interpretation, which appears to be leading to apprehension around data collection for dissertations. A simple field-based practical was developed in which 47 Part 2 Geography and Environmental Science students tested their own hypotheses about the factors that control water infiltration into soils. Improved confidence and appreciation of critical thinking around environmental data were reported in a survey of the student experience. Student coursework demonstrated that attainment was very good, and that skills and critical thinking can be recovered and enhanced with relatively simple, low-cost field-based practical classes that can be readily embedded to scaffold subsequent modules, including the dissertation.

Context

The importance of field work is well established in Geography and Environmental Science as a means of active and peer-to-peer learning. However, students appear to have little confidence in designing their own field work for hypothesis testing when they arrive for Part 1, probably due to a decline in field work in Secondary Education (Kinder 2016, Lambert and Reiss 2014). Within the Geography and Environmental Science programmes there is a Part 2, 20-credit ‘Research Training’ module that develops these skills. However, this research training module and the dissertation are seen by students as high risk, in that they perceive a low mark will have a significant negative impact on their overall degree classification. Consequently, students are seemingly risk averse around field-based projects. The idea here is to make field-based training more commonplace throughout multiple modules through the inclusion of relatively simple practical training, so that hypothesis testing, critical thinking and confidence with ‘messy’ environmental data become intuitive and students are at ease with these concepts. In parallel, GES module cohorts have increased in recent years, which is an additional reason to develop simple, low-cost practical classes.

Objectives

The aim of the project was to determine if a simple, field-based practical would help boost student confidence around field data collection, interpretation and hypothesis testing. The objective was to give the students a safe and supportive environment in which to develop their own hypotheses and methods for field data collection, and to learn to interpret often ‘messy’ and ‘complex’ environmental data.

Figure 1: The practical class took place on the hill-slope on campus between the Atmospheric Observatory and Whiteknights Lake on 28 October 2019, over four hours in total.

 

Figure 2: Students used a Decagon Devices Mini-Disc Infiltrometer to measure unsaturated hydraulic conductivity to test their own hypotheses about the factors controlling infiltration.

Implementation

A practical was designed in which 47 Part 2 students, working in groups of four or five, developed their own hypotheses around the factors controlling rainfall infiltration on a hill-slope following an in-class briefing, and then tested these hypotheses in the field using Mini Disc infiltrometers (Figs. 1, 2 and 3). In a further follow-up session, each student spent two hours processing the data collected and was briefed on the coursework write-up.

Figure 3: The students tested hypotheses around distance from the lake, vegetation and soil type, soil moisture and soil compaction. Each student group spent two hours in the field.

Impact

Of 40 students who responded to an on-line survey:

  • 37 agreed the practical helped develop their critical thinking skills around complex and incomplete environmental data;
  • 36 agreed they were now better able to deal with uncertainty in field-based measurements;
  • 38 felt more confident working in the field.

Student quotes included:

  • “The practical was very useful in helping to understand the processes happening as well as being more confident in using the equipment.”
  • “I thought the practical was good as it was another way to process information which tends to work better for me, doing and seeing how it works allows me to gain a higher understanding in the processes”

The majority of students gained first class and upper second-class marks for the project write-up and the reports submitted demonstrated good critical thinking skills in the interpretation of the infiltration measurements. There has been a noticeable increase in the number of students opting for hydrology-based dissertations.

Reflections

Confidence and critical thinking skills can be enhanced with relatively simple, low-cost field-based practicals that scaffold subsequent modules, including Research Training for Geographers and Environmental Science and the dissertation, and that focus on hypothesis testing in addition to knowledge acquisition. Each student spent two hours in the field on campus and two hours processing their data, with further time on the coursework write-up. This seems a reasonable investment of time given the benefits in confidence, skills and knowledge. Embedding such practicals should not replace the larger skills-based modules, such as Research Training, nor should they entirely replace classes that focus more on knowledge acquisition; however, practical classes where students explore their own ideas appear to be a useful means of boosting student confidence and critical thinking skills at an early stage. The practical was also an excellent means of encouraging peer-to-peer interaction and learning, and this and similar practical classes have good potential for the integration of home and NUIST students.

Follow up

Embed similar practical classes in Part 1 modules to build confidence at the outset of the degree programme and, at Part 3, to further enable integration of home and NUIST students.

Links and References

Kinder A. 2016. Geography: The future of fieldwork in schools. Online: http://www.sec-ed.co.uk/best-practice/geography-the-future-of-fieldwork-in-schools/ (Last accessed: 03 Jan 2020).

Lambert D and Reiss MJ. 2014. The Place of Fieldwork in Geography and Science Qualifications. Institute of Education, University of London. ISBN: 978-1-78277-095-4. 20 pp.

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a re-design that was undertaken for the Part 1 autumn/spring chemistry module, CH1PRA, which services approximately 45 students per year. All students complete practical work over 20 weeks of the year. There are four blocks of five weeks of practical work in rotation (introductory, inorganic, organic and physical) and students spend one afternoon (4 hours) in the laboratory per week. The re-design was partly due to COVID: we were forced to look critically at the experiments the students completed, to ensure that the practical skills students developed during the pandemic were relevant for Part 2 and beyond, and to ensure that the assessments could also stand alone as exercises if COVID prevented the completion of practical work. COVID actually provided us with an opportunity to re-invigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

• Redesign CH1PRA so that it was COVID-safe and fulfilled strict accreditation criteria
• Redesign the experiments so that as many students as possible could complete practical work, converting some experiments for completion on the open bench to maximise laboratory capacity
• Redesign the assessments so that students who missed sessions due to COVID could still gain credit
• Minimise the assessment load on academic staff and students
• Move to a more skills-based assessment paradigm, away from the traditional laboratory report

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of practical classes due to restrictions on the number of students allowed within the laboratory: 12 students in the fumehoods and 12 students on the open bench (rather than up to 74 students all using fumehoods previously). Prior to the redesign, each student completed four or five assessments per 5-week block, all of which related to a laboratory-based experiment. In addition, the majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as we intended, so execution of the experiment was passive. The technical reports placed a significant marking burden upon the academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments upon completion of a practical experiment was also deemed high-risk with the advent of COVID, so we had to re-think our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing processing of 24 students per week (12 on the bench and 12 in the fumehood), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class, thereby fulfilling our accreditation requirements. The majority of assessments were linked to the ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that if a student had to self-isolate they could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each 5-week block had three assessment points and, where possible, one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require upon employment or during a placement. To encourage active learning, reflection was embedded into the assessment regime; it was hoped that by critically appraising their performance in the laboratory, students would better remember the skills and techniques that they had learnt, rather than falling into the “see, do, forget” mentality that is prevalent within practical classes.

Examples of assessments include: undertaking data analysis, focussing on clear presentation of data; critical self-reflection of the skills developed during a practical class i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critically engaging with a published scientific procedure; and giving a three-minute presentation about a practical scientific technique commonly-encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well-run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments that we had to make to the delivery of laboratory based practicals did not have a negative impact upon the students’ experience and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous, and there was a slight disconnect between what we expected students to achieve from the online practicals and what students were achieving. A number of the students polled disliked the online practical work, with the main reason being that the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and by ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they are aligned more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.


Can Online Learning Facilitate Meaningful Interpersonal Connection?

Shelley Harris

shelley.harris@reading.ac.uk

Overview

As part of my role as a Creative Writing lecturer, I link undergraduates with professionals from the publishing industry, offering – among other things – extracurricular events for students in the School of Literature and Languages. In the past, these have broadened students’ understanding of the roles involved in publishing and given them hands-on, CV-friendly experience of the skills required in those roles. The goal is to improve students’ knowledge, confidence and employability rather than secure them a job, though sometimes they are given an unexpected leg-up: after a course in 2018, the visiting editor was so impressed by one of our students that he introduced her to a contact at Hachette.

Whatever the specifics of the event, I always seek to bring those professionals into the room – in part to demystify this competitive sector, and in part because, as a ‘high context’ industry (Hall 1977), it has historically offered jobs to the privileged: those with cultural capital. My aim is to give all our students the chance to accrue such capital.

 

 

Objectives

My ambition for the online event remained the same as its original iteration: to facilitate meaningful connections between our students and the industry guests.

Context

In Spring 2020 I organised an event for Professional Track which – after a panel discussion – would put students into informal breakout groups with early-career publishing professionals. This sort of personal contact is rare, and hugely beneficial for students with an ambition to work in publishing.

The event was scheduled for April, the tea and cake were ordered – and then lockdown occurred. With some trepidation, I redesigned it as an online experience using Blackboard Collaborate. But could an online event really enable the sorts of human connection offered by a face-to-face meeting?

Implementation

TEL’s one-to-one help sessions were a gamechanger for this project, with TEL advisor Chris Johnson offering expert guidance, including the sorts of troubleshooting tips that make all the difference to an online project. There isn’t enough space here to detail them all, but I would hugely recommend making the most of TEL’s expertise.

On the day, the event began with a conventional panel discussion in which I interviewed the guests (an editor, a publicist, a books marketer and a literary agent’s assistant) about their routes into publishing and their experience of work. Students turned off their mics and video, watched the panel and put questions into the text chat, which I then moderated. Next, I put students into small groups using Collaborate’s ‘Breakout Groups’ feature. Each included one publishing professional. I invited all participants to turn on their cameras and mics so that discussion could be more personal and informal. As facilitator, I moved between groups – not participating, but making sure things were running smoothly.

Impact

To what extent was meaningful interpersonal connection facilitated by this online event? Qualitative feedback from students suggests that the ensuing discussions were fruitful. One respondent said: ‘Engaging with the industry professionals in the smaller groups was something that I found to be particularly helpful’, while another said they appreciated ‘talking to individuals with real experience in the sector I am curious about working in’.

As with the previous course, one student benefitted in an immediate way, with a guest speaker offering to show her artwork to a children’s publisher. It was encouraging evidence that remote events can bring people together.

Indeed, there were aspects of the online event that seemed to offer advantages over face-to-face meeting; online, there’s a hierarchy of depersonalisation, from a simulacrum of face-to-face (cameras and mics on) through audio only, to text chat which identifies students by name and finally the anonymity of Collaborate’s whiteboard function. This is hard to reproduce in a bricks-and-mortar seminar room – and it liberates participants.

An example of that liberation came in two of the small group discussions, when talk was slow to start and the guest speakers asked students to put questions into text chat instead. Conversation picked up, and once it was under way, students were invited to activate their cameras and microphones. On reflection, I’d start all small group discussion like this next time. The feedback below (in answer to a question about the ways in which the online event was better than an in-person one) suggests how much safer this can make students feel, and how it can lower inhibitions about joining in.

Reflections

We all accept that in-person encounters offer us ways of connecting to each other that are hard to reproduce online, but the reverse is also true. It’s something our neurodivergent students already know (Satterfield, Lepage and Ladjahasan 2015), but my experience on this project has made me sharply aware of the ways in which all participants stand to benefit.

The ‘Get into Publishing’ event has left me cautiously optimistic about facilitating meaningful social connections in the online environment, and keen to further explore its unique social opportunities. And, as Gilly Salmon (2011) makes clear, those connections are not just ‘extras’ – they are absolutely central to successful remote learning.

Links and References

Hall E T (1977), Beyond Culture. Garden City, NY: Anchor Books

Satterfield D, Lepage C & Ladjahasan N (2015) ‘Preferences for online course delivery methods in higher education for students with autism spectrum disorders’, Procedia Manufacturing, 3, pp. 3651-3656

Salmon G (2011), E-Moderating: The Key to Online Teaching and Learning. New York: Routledge, p. 36

Improving student assessment literacy & engaging students with rubrics

Dr. Allan Laville

School of Psychology & Clinical Language Sciences

In this 14-minute video, early rubrics adopter Dr. Allan Laville shares how he and colleagues in Psychology have sought to improve student assessment literacy, and have successfully engaged students with their assessment rubrics by embedding analysis of them into in-class teaching and by using screencasts, discussion boards and student partnership. Lots of useful ideas and advice – well worth a watch.

Promoting and Tracking Student Engagement on an Online Undergraduate Pre-sessional Course

Sarah Mattin: International Study and Language Institute

Overview

This case study outlines approaches to fostering an active learning environment on the University’s first fully online Undergraduate Pre-sessional Course which ran in Summer 2020 with 170 students. It reports staff and student feedback and reflects on how lessons learnt during the summer can inform ISLI’s continued online delivery this autumn term and beyond.

 

Objectives

  • To design and deliver an online Pre-sessional Course to meet the needs of 170 students studying remotely, mostly in China
  • To promote student engagement in learning activities in an online environment
  • To devise effective mechanisms for tracking student engagement and thus identify students who may require additional support

 

Context

The Pre-sessional Programme (PSE) is an English for Academic Purposes (EAP) and academic skills development programme for degree offer holders who require more study to meet the English Language requirements of their intended programme. The programme runs year-round and, in the summer, has separate UG and PG courses. We would usually expect to welcome around 700 students to the campus for the summer courses (June-September); in summer 2020 we took the courses fully online in response to the COVID crisis. This case study focuses on the Undergraduate Course.

 

Implementation

Due to the constraints of the time difference between the UK and China, where most students were based, we knew learning on the course would need to be largely asynchronous. However, we were keen to promote active learning and so adopted the following approaches:

  • Use of the online authoring tool Xerte to create interactive learning materials which enabled students to have immediate feedback on tasks.
  • Incorporation of asynchronous peer and student-teacher interaction into the course each week through scaffolded tasks for the Blackboard Discussion Boards.
  • Setting up of small study groups of 3-4 students within each class of 16 students. Each group had fortnightly tutorials with the teacher and was encouraged to use the group for independent peer support.
  • Live online interactive sessions which took a ‘flipped’ approach, so students came prepared to share and discuss their work on a set task and ask any questions.

In order to track engagement with the learning materials we used Blackboard Tests to create short (4-5 questions) ‘Stop & Check’ quizzes at regular intervals throughout the week. We used the Grade Centre to monitor completion of these. We also made use of other student engagement monitoring features of Blackboard, in particular the Retention Centre within Evaluation and Blackboard Course Reports which enable instructors to track a range of user activity.

 

Impact

Our tracking showed that most students were engaging with the tasks daily, as required. We were very quickly able to identify a small group of students who were not engaging as hoped and target additional communication and support to these students.

Student feedback demonstrated that students perceived improvements in their language ability across the four skills (reading, writing, speaking and listening), and this was confirmed by their results at the end of the course. Student outcomes were good, with over 90% of students achieving the scores they needed to progress to their chosen degree programme. This compares favourably with the progression rate for the on-campus course run in previous years.

Feedback from teachers on the learning materials was very positive. One teacher commented that ‘The videos and Xerte lessons were excellent. As a new teacher I felt the course was very clear and it has been the best summer course I have worked on’. Teachers highlighted Xerte, the Discussion Boards and the interactive sessions as strengths of the course.

The materials and overall design of the course have informed the Pre-sessional Course (PSE 1) which is running this Autumn Term.

 

Reflections

Overall, we designed and delivered a course which met our objectives. Some reflections on the tools and approaches we employed are as follows:

Xerte lessons: these were definitely a successful part of the course, enabling us to provide interactive asynchronous learning materials with immediate feedback to students. We also found the Xerte lessons enabled us to make coherent ‘packages’ of smaller tasks, helping us to keep the Blackboard site uncluttered and easy to navigate.

Discussion Boards: teacher feedback indicated that some felt this part of the course was an enhancement on the previous face-to-face delivery. Points we found key to the success of Discussion Board tasks were:

  • Creation of a new thread for each task to keep threads a manageable size
  • Linking to the specific thread from the task using hyperlinks
  • Detailed and specific Discussion Board task instructions for students broken down into steps of making an initial post and responding to classmates’ posts with deadlines for each step
  • Teacher presence on the Discussion Board
  • Teacher feedback on group use of the Discussion Board in live sessions to reinforce the importance of peer interaction

Small study groups: these were a helpful element of the course and greater use could have been made of them. For example, one teacher developed a system of having a rotating ‘group leader’ who took responsibility for guiding the group through an assigned task each week. In the future we could incorporate this approach and build more independent group work into the asynchronous learning materials to reinforce the importance of collaboration and peer learning.

Live sessions: student feedback showed clearly that this was an aspect of the course they particularly valued. Both students and teachers felt there should be more live contact, but that sessions need not be long; even an additional 30 minutes a day would have made a difference. Teachers and students commented that Teams provided a more stable connection for students in China than Blackboard Collaborate.

Blackboard Tests and the monitoring features of Blackboard: these were undoubtedly useful tools for monitoring student engagement. However, they generate a great deal of data which is not always easy to interpret ‘at a glance’, and they provide a fairly superficial account of engagement. Most teachers ended up devising their own tracking systems in Excel, which enabled them to identify and track performance on certain key tasks each week.
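For illustration, the kind of weekly tracking teachers built in Excel can be sketched in a few lines of code. This is a hypothetical sketch, not part of the course materials: it assumes a CSV export from the Grade Centre with one row per student and one column per ‘Stop & Check’ quiz, and all column names here are invented for the example.

```python
import csv

def flag_low_engagement(path, quiz_columns, min_completed):
    """Return (student, completed-count) pairs for students who completed
    fewer than min_completed of the listed quiz columns.

    A quiz counts as completed if its cell in the export is non-empty.
    """
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            completed = sum(1 for col in quiz_columns if row.get(col, "").strip())
            if completed < min_completed:
                flagged.append((row["Student"], completed))
    return flagged
```

Run weekly against the latest export, this reproduces the key benefit the teachers reported: quickly surfacing the small group of students who are not engaging as hoped so that additional support can be targeted at them.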

 

Follow up

Taking into account the feedback from this year, the materials developed could be used in future to facilitate a flipped learning approach on the course, with students studying on campus or remotely. This would address the calls for more teacher-student interaction and enable the course to respond flexibly to external events. Currently, we are applying lessons learnt from the summer to the delivery of our Pre-sessional and Academic English Programmes running this term.

 

Links

The Pre-sessional English and Academic English Programme webpages give more details about the Programmes

Pre-sessional: http://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Academic English Programme: http://www.reading.ac.uk/ISLI/enhancing-studies/isli-aep.aspx

Misconceptions About Flipped Learning


 

During the COVID-19 pandemic, colleagues at UoR were called upon to adjust their courses almost overnight from face-to-face teaching to fully online delivery. As the immediate future is still full of uncertainty, the UoR (2020) teaching and learning framework asks us to be creative in our pedagogical approaches and to come up with strategies that make courses stimulating and engaging. Flipped learning is one of the approaches suggested in the framework. With that in mind, I have written two articles about flipped learning, published here and here.

Flipped learning is a pedagogical approach whose time has come during COVID-19. The advancement of internet technology, online learning platforms and social media, combined with growing exposure to the flipped learning approach, has promoted its adoption during this pandemic. However, despite its popularity and the published literature about it, there are evidently many misconceptions about flipped learning, and it remains a somewhat poorly understood concept among many.

In this last article, I thought I would share with you some of the misconceptions about flipped learning that resonate most with me. At the same time, let us reflect on them and see how we can overcome them where possible. Your feedback is always welcome; please do send me your thoughts via w.tew@henley.ac.uk.

 

Misconception 1: Flipped learning is about putting video content online

Reflection: This may be the most popular format for flipped learning, but it is NOT simply about putting videos online and having students do homework in class (or online during this pandemic). The UoR (2020) Teaching and Learning: Framework for Autumn term 2020 encourages us to prepare our teaching and lectures in a video format. This format works well with the flipped learning instructional strategy for delivering teaching content, but flipped learning can be about much more than that. Colleagues can opt for videos or just text (reading) materials when they flip their lessons. For example, we can make good use of the Blackboard LMS platform to include online reading materials via Talis Aspire, journal articles, case studies and news items relevant to our students. In other words, flipped learning does not necessarily use video at all.

 

Misconception 2: You need to be in the video

Reflection: This is not necessarily the case, especially as so many of us are simply shy and ‘unnatural’ in front of the camera, which is just how I feel myself. This is why the voice-recorded PowerPoint format can be a ‘lifesaver’ for many of us. Having said that, appearing in the video adds a personal touch to the learning materials for students. For example, wearing different hats when filming your videos makes them more interesting and helps ‘draw’ students’ attention to your content and lessons. Try it; you may even earn a “Mad Hatter” title from your students. Just one of my crazy ideas.

 

Misconception 3: You need to flip your entire module 

Reflection: Many of us assume that we need to flip our entire module for the entire academic year. Not necessarily so! The whole idea of flipped learning is to foster student-centred learning, so that teaching can be personalised to suit students’ needs and learning pace. Therefore, you can flip just one concept or topic, a whole term or only some weeks. Remember, the focus is on the students’ learning needs: a one-size-fits-all approach definitely does not fit in a flipped learning environment.

 

Misconception 4: Flipped learning is a fad and people have been doing it for years

Reflection: This was my initial thought when I first came to know about flipped learning. A fad is defined as “a style, activity, or interest that is very popular for a short period of time”, an innovation that never takes hold. Flipped learning is anything but this. That it is still actively studied and researched today shows it is not just a fad. Talbert (2017) argued that flipped learning is not simply a rebranding of old techniques: it has its own pedagogical framework and value in its effects on learning. In brief, the definition of flipped learning (see Flipped Learning Network, 2014) differentiates it from other learning approaches.

 

Misconception 5: Flipping the classroom takes too much time

Reflection: To be honest, I do think this is true. Preparing for flipped learning and flipping lessons takes a lot of energy and time; from my own experience, I can testify that the investment is significant. How much depends on how tech-savvy the teacher is and how much of the teaching content needs to be flipped. However, the fruit of the hard labour is that, once the materials are designed, they save time. Ironic, isn’t it? That has been my experience. What I am trying to show you is that once it is done, you can reuse the same content year after year, and any updates or changes will take far less time than creating everything from scratch.

Finally, I hope you enjoy my series on flipped learning published on this platform. I sincerely urge you to consider the flipped learning pedagogical approach during this pandemic, and please do not hesitate to get in touch to continue this conversation.

References

Flipped Learning Network (FLN) (2014) The Four Pillars of F-L-I-P™. Available at: www.flippedlearning.org/definition.

Talbert, R. (2017) Flipped Learning: A Guide for Higher Education Faculty. Stylus Publishing.

UoR (2020) Teaching and Learning: Framework for Autumn Term 2020. Available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

Running Virtual Focus Groups – Investigating the Student Experience of the Academic Tutor System

Amanda Millmore, School of Law

Overview

I wanted to measure the impact of the new Academic Tutor System (ATS) on the students in the School of Law, and capture their experiences, both good and bad, with a view to making improvements. I successfully bid for some small project funding from the ATS Steering Group prior to Covid-19. The obvious difficulty I faced in lockdown was how to engage my students and encourage them to get involved in a virtual project. How can students co-produce resources when they are spread around the world?

Objectives

I planned to run focus groups with current students, with dual aims:

  • To find out more about their experiences of the academic tutor system and how we might better meet their needs; and
  • To see if the students could collaboratively develop some resources advising their fellow students how to get the most out of tutoring.

The overall aim was to raise the profile of academic tutoring within the School and the positive benefits it offers to our students, but also to troubleshoot any issues.

Implementation

After exams, I emailed all students in the School of Law to ask them to complete an anonymous online survey about their experiences. Around 10% of the cohort responded.
Within that survey I offered students the opportunity to join virtual focus groups. The funding had originally been targeted at providing refreshments as an enticement to get involved, so I redeployed it to offer payment via Campus Jobs for the students’ time (a remarkably easy process). I was conscious that many of our students had lost their part time employment, and it seemed a positive way to help them and encourage involvement. I had 56 volunteers, and randomly selected names, ensuring that I had representation from all year groups.

I ran two focus groups virtually using MS Teams, each containing students from different years. This seemed to work well for the 11 students, who were all able to join the sessions, and recording the sessions online enabled me to take a more accurate note, which was particularly helpful. I was pleasantly surprised at how well the conversation flowed virtually: with no more than six students in a group we kept all microphones on to allow everyone to speak, and I facilitated with some prompts, encouraging quieter participants to offer their opinions.
The students were very forthcoming with advice and their honest experiences. They were clear that a good tutor relationship can make a real and noticeable difference for students and those who had had good experiences were effusive in their praise. They were keen to help me find ways to improve the system for everyone.

Results

The students collaborated to produce their “Top Tips for Getting the Most Out of Your Academic Tutor”, which we have turned into a postcard to share with new undergraduates, using the free design software Canva (https://www.canva.com/).
The students also individually made short videos at home of their own top tips, and emailed them to me; I enlisted my teenage son to edit those into 2 short videos, one aimed at postgraduates, one for undergraduates, which I can use as part of induction.

From the project I now have useful data as to how our students use their academic tutor. A thematic analysis of qualitative comments from the questionnaires and focus groups identified 5 key themes:

  • Tutor availability
  • Communication by the tutor
  • School level communication
  • Content of meetings
  • Staffing

From these themes I have drawn up a detailed action plan to be implemented to deal with student concerns.

Impact & Reflections

One of the main messages was that we need to do better at clearly communicating the role of the academic tutor to students and staff.

The students’ advice videos are low-tech but high impact, all recorded in lockdown on their phones from around the world, sharing what they wish they’d known and advising their fellow Law students how to maximise the tutor/tutee relationship. The videos have been shared with the STAR mentor team and the ATS Steering Group, and the MCE team now has the original footage to see if the videos can be used University-wide.

I am firmly convinced that students are more likely to listen to advice from their peers than from an academic, so am hopeful that the advice postcards and videos will help, particularly if we have a more virtual induction process in the forthcoming academic year.

Ultimately, whilst not the project I initially envisaged, our virtual focus group alternative worked well for my student partners, and they were still able to co-create resources, in a more innovative format than I anticipated. My message to colleagues is to trust the students to know what will work for their fellow students, and don’t be afraid to try something new.

 

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen, School Director of Teaching and Learning, ISLI, and Alison Nicholson, Honorary Fellow, UoR

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, the challenges and the outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into its assessment mix. Group assessments constitute around 4% of the University’s assessment types (EMA data, academic year 2019-20).

Objectives

  • To improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • To help students be more relaxed and produce more authentic and spontaneous language
  • To make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules centre on the ability to communicate in the language. The teaching methodology favours student–student interaction and collaboration: in class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student interacted with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers as well as students were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed: one teacher concentrated on running the exam and marking against holistic criteria, while the second listened and marked using analytic rating scales

Expected outcomes

  • Students to be more relaxed and produce more authentic and spontaneous language
  • Student-to-student interaction to create a more relaxed atmosphere
  • Students to take longer speaking turns
  • Students to use more features of interaction

(Hardi Prasetyo, 2018)

  • Perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how the competence of the person I am speaking to affects my outcomes (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses from students of Italian. Responses were consistent across the two languages.

Eight teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and felt that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, or that speaking to another student was more relaxing. The qualitative feedback tells us that this is due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for an average cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on the students’ ability to demonstrate that they could communicate in the language.

However, this was not reflected in the students’ feedback: there is a clear discrepancy between teachers’ and students’ perceptions of how well the new format allows students to showcase their learning.

Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi & French, in Taylor, L. (ed.) (2011) Examining Speaking: Research and Practice in Assessing Second Language Speaking. Cambridge: Cambridge University Press.

Fulcher, G. (2003) Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014) Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015) The Lecturer’s Toolkit: A Practical Guide to Assessment, Learning and Teaching (4th ed.). London; New York: Routledge.

 

Supporting Transition: Investigating students’ experiences of transferring from University of Reading Malaysia campus (UoRM) to the University of Reading UK campus (UoR)

Daniel Grant, Associate Professor in Clinical Pharmacy & Pharmacy Education and Pharmacy Director of Teaching & Learning, and Dr Taniya Sharmeen, Research Fellow

 

Click here to read the full report.

 

This slide summarises the project: