Xerte: engaging asynchronous learning tasks with automated feedback

Overview

Jonathan Smith: j.p.smith@reading.ac.uk

 

International Study and Language Institute (ISLI)

This article describes a response to the need to convert paper-based learning materials, designed for use on our predominantly face-to-face summer Pre-sessional programme, to an online format, so that students could work independently and receive automated feedback on tasks, where that is appropriate. A rationale for our approach is given, followed by a discussion of the challenges we faced, the solutions we found and reflections on what we learned from the process.

Objectives

The objectives of the project were broadly to:

  • rethink ways in which learning content and tasks could be presented to students in online learning formats
  • convert paper-based learning materials intended for 80 – 120 hours of learning to online formats
  • make the new online content available to students through Blackboard and monitor usage
  • elicit feedback from students and teaching staff on the impacts of the online content on learning.

It must be emphasised that, because we needed to develop a fully online course in approximately eight weeks, we focused mainly on the first three of these objectives.

Context

The move from a predominantly face-to-face summer Pre-sessional programme, with 20 hours/week contact time and some blended-learning elements, to fully-online provision in Summer 2020 presented both threats and opportunities to ISLI.  We realised very early on that it would not be prudent to attempt 20 hours/week of live online teaching and learning, particularly since most of that teaching would be provided by sessional staff, working from home, with some working from outside the UK, where it would be difficult to provide IT support. In addition, almost all students would be working from outside the UK, and we knew there would be connectivity issues that would impact on the effectiveness of live online sessions.  In the end, there were 4 – 5 hours/week of live online classes, which meant that a lot of the core course content had to be covered asynchronously, with students working independently.

We had been working with Xerte, an open-source tool for authoring online learning materials, for about three years, creating independent study materials for consolidation and extension of learning based around print materials. This was an opportunity to put engaging, interactive online learning materials at the heart of the programme. Here are some of the reasons why we chose Xerte:

  • It allows inputs (text, audio, video, images), interactive tasks and feedback to be co-located on the same webpage.
  • It offers a very wide range of interactive task types, including drag-and-drop ordering, categorising and matching tasks, and “hotspot” tasks in which clicking on part of a text or image produces customisable responses.
  • It offers considerable flexibility in planning navigation through learning materials, and in the ways feedback can be presented to learners.
  • Learning materials can be created by academic staff without the need for much training or support.

Xerte was only one of the tools for asynchronous learning that we used on the programme.  We also used stand-alone videos, Discussion Board tasks in Blackboard, asynchronous speaking skills tasks in Flipgrid, and written tasks submitted for formative or summative feedback through Turnitin.  We also included a relatively small number of tasks presented as Word docs or PDFs, with a self-check answer key.

Implementation

We knew that we only had time to convert the paper-based learning materials into an online format, rather than start with a blank canvas, but it very quickly became clear that the highly interactive classroom methodology underlying the paper-based materials would be difficult to translate into a fully-online format with greater emphasis on asynchronous learning and automated feedback.  As much as possible we took a flipped learning approach to maximise efficient use of time in live lessons, but it meant that a lot of content that would normally have been covered in a live lesson had to be repackaged for asynchronous learning.

In April 2020, when we started to plan the fully-online programme, we had a limited number of staff who were able to author in Xerte. Fortunately, we had developed a self-access training resource which meant that new authors were able to learn how to author with minimal support from ISLI’s TEL staff. A number of sessional staff with experience in online teaching or materials development were redeployed from teaching on the summer programme to materials development. We provided them with a lot of support in the early stages: providing models and templates, storyboarding, and reviewing drafts together. We also produced a style guide so that we had consistent formatting conventions and presentation standards.

The Xerte lessons were accessed via links in Blackboard, and in the end-of-course evaluations we asked students and teaching staff a number of open and closed questions about their response to Xerte.

Impact

We were not in a position to assess the impact of the Xerte lessons on learning outcomes, as we were unable to separate this from the impact of other aspects of the programme (e.g. live lessons, teacher feedback on written work). Students are assessed on the basis of a combination of coursework and formal examinations (discussed by Fiona Orel in other posts to the T&L Exchange), and overall grades at different levels of performance were broadly in line with those in previous years, when the online component of the programme was minimal.

In the end-of-course evaluation, students were asked “How useful did you find the Xerte lessons in helping you improve your skills and knowledge?” 245 students responded to this question: 137 (56%) answered “Very useful”, 105 (43%) “Quite useful” and 3 (1%) “Not useful”. The open questions provided a lot of useful information that we are taking into account in revising the programme for 2021. There were technical issues around playing video for some students, and bugs in some of the tasks; most of these issues were resolved after they were flagged up by students during the course. In other comments, students said that feedback needed to be improved for some tasks, that some of the Xerte lessons were too long, and that we needed to develop a way in which students could quickly return to specific Xerte lessons for review later in the course.

Reflections

We learned a lot, very quickly, about instructional design for online learning.

Instructions for asynchronous online tasks need to be very explicit and unambiguous, because at the time students are using Xerte lessons they are not in a position to check their understanding either with peers or with tutors.  We produced a video and a Xerte lesson aimed at helping students understand how to work with Xerte lessons to exploit their maximum potential for learning.

The same applies to feedback. In addition, to have value, automated feedback generally (but not always) needs to be detailed, with explanations of why specific answers are correct or incorrect. We found, occasionally, that short videos embedded in the feedback were more effective than written feedback.

Theoretically, Xerte will track individual student use and performance, if lessons are uploaded as SCORM packages into Blackboard, with grades feeding into Grade Centre. In practice, this only works well for a limited range of task types. The most effective way to track engagement was to follow up on Xerte lessons with short Blackboard tests. This is not an ideal solution, and we are looking at other tracking options (e.g. xAPI).
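To give a sense of what xAPI-based tracking captures, here is a minimal sketch of the kind of statement a lesson might send to a Learning Record Store. The statement structure (actor, verb, object, result) follows the xAPI specification, but the email address, lesson URL and helper function are purely illustrative, not part of our Xerte setup.

```python
# Minimal sketch of an xAPI ("Tin Can") statement: the JSON record a
# learning activity sends to a Learning Record Store (LRS).
# All identifiers below are illustrative placeholders.
import json


def make_statement(student_email, lesson_url, score):
    """Build a 'completed' statement for one student on one lesson."""
    return {
        "actor": {
            "mbox": f"mailto:{student_email}",
            "objectType": "Agent",
        },
        "verb": {
            # A standard verb identifier from the ADL registry
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-GB": "completed"},
        },
        "object": {
            "id": lesson_url,
            "objectType": "Activity",
        },
        "result": {
            "score": {"scaled": score},  # 0.0 to 1.0, per the xAPI spec
            "completion": True,
        },
    }


statement = make_statement("student@example.com",
                           "https://example.org/xerte/lesson-12", 0.8)
print(json.dumps(statement, indent=2))
```

Because each statement records who did what, on which activity, with what result, an LRS can report engagement at a much finer grain than Grade Centre alone.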

Over the four years we have been working with Xerte, we had occasionally heard suggestions that Xerte was too complex for academics to learn to use. This emphatically was not our experience over Summer 2020. A number of new authors were able to develop pedagogically-sound Xerte lessons, using a range of task types, to high presentation standards, with almost no one-to-one support from the ISLI TEL team. We estimate that, on average, new authors need to spend five hours learning how to use Xerte before they are able to develop materials at an efficient speed, with minimal support.

Another suggestion was that developing engaging interactive learning materials in Xerte is so time-consuming that it is not cost-effective. It is time-consuming, but, placed in a situation in which we felt we had no alternative, we managed to achieve all we set out to achieve. Covid and the need to develop a fully-online course under pressure of time really focused our minds. The Xerte lessons will need reviewing, and some will definitely need revision, but we face Summer 2021 in a far more resilient, sustainable position than at this time last year. We learned that it makes sense to plan for a minimum five-year shelf life for online learning materials, with regular review and updating.

Finally, converting the paper-based materials for online learning forced us to critically assess them in forensic detail, particularly in the ways students would work with those materials.   In the end we did create some new content, particularly in response to changes in the ways that students work online or use technological tools on degree programmes.

Follow up

We are now revising the Xerte lessons, on the basis of what we learned from authoring in Xerte, and the feedback we received from colleagues and students. In particular, we are working on:

  • ways to better track student usage and performance
  • ways to better integrate learning in Xerte lessons with tasks in live lessons
  • improvements to feedback.

For further information, or if you would like to try out Xerte as an author and with students, please contact j.p.smith@reading.ac.uk, and we can set up a trial account for you on the ISLI installation. If you are already authoring with Xerte, you can also join the UoR Xerte community by asking to be added to the Xerte Users Team.

Links and References

The ISLI authoring website provides advice on instructional design with Xerte, user guides on a range of page types, and showcases a range of Xerte lessons.

The international Xerte community website provides Xerte downloads, news on updates and other developments, and a forum for discussion and advice.

Finally, authored in Xerte itself, this website provides the most comprehensive showcase of all the different page types available in Xerte, showing its potential functionality across a broad range of disciplines.

Organising a Peer Review Event of Synchronous Online Teaching

 

 

Vicky Collins

v.collins@reading.ac.uk

International Study and Language Institute (ISLI)

Overview

Each year, tutors on the Academic English Programme [AEP] run by ISLI participate in an internal showcase of tips and best practices as a Peer Review event. This year we chose the theme of ‘Managing synchronous online teaching’, given the level of challenge it presents and the creative solutions emerging in response. The event included a pre-task, live demos and talk-throughs, a list of external resources for participants, and a final compendium of curated videos (recorded from the live delivery) for wider dissemination in ISLI.

Objectives

The main aims of the annual Peer review event are:

  • To draw out good practice and reflect on experiences of teaching on the Academic English Programme [AEP] from the previous year or term
  • To foster a team environment by providing for a network to exchange ideas and experiences

Whilst acknowledging that there are many tutorials and training videos widely available to support online teaching, for this peer review on ‘Managing synchronous online teaching’ we were keen to draw out tips and practices developed in situ.

Context

Since September 2020, the whole AEP team has been involved in regular live online teaching. This includes all aspects of the Academic English Programme, from discipline-specific courses to webinars on particular aspects of academic language and literacy. Prior to this, most of our experience and training for the transition to online T&L was around asynchronous delivery, so we thought a review and reflection on online synchronous teaching was timely.

Implementation

I started by researching sub themes within ‘managing online synchronous teaching’ and developed a list from which to solicit ideas for tips and practices from the teaching team:

  • Opening and closing a session
  • Spaces for collaboration
  • Tools for live interactivity
  • Multitasking during sessions
  • Working with longer texts online
  • Teaching to the void: finding ways of connecting to students
  • Staging & organising of activities

 

Soliciting, categorising and developing initial ideas in advance with colleagues helped avoid duplication and also meant I could allocate suitable timings to presentations and demos. The event was scheduled for three hours, with rest breaks and question times, and was held on Teams. Given the commitment of time required, this Peer Review event takes place in January each year, before the Spring term gets underway; it also helps boost morale on return from the festive break.

Prior to the event I set a pre-task on the topic of ‘What does live online learning look like for our discipline?’. A discussion thread was set up and colleagues contributed voluntarily.

I also developed a list of external resources on the topic of online synchronous learning to share with participants after the event.

The Peer Review event itself was recorded, with permission, and I then curated the individual presentations into 18 bite-sized videos.

Impact

A feedback survey was issued once all resources had been released. This was completed by 8 out of 12 participants.

The aim of the feedback survey was to gauge satisfaction with the event and the impact it had on participants’ preparedness for live online teaching this term [Spring 2021].

In terms of satisfaction, all participants strongly agreed that the event was effectively organized.

All participants strongly agreed [5] or agreed [3] that the event and resources helped them to feel more prepared. Varying levels of confidence in live online scenarios are reflected in participants’ familiarity with the ideas presented by their peers. Most reported that 25–50% of the activities were new to them, and two reported a greater percentage.

Participants listed a range of the tips and practices presented by peers that they would like to try out this term [Spring 2021].

Reflections

One of the most significant outcomes was that, despite the plethora of tutorials and training vignettes (on YouTube, for example) which teaching staff can consult, a genuine account of these tips and practices in situ is still much valued. My colleagues were able to discuss with sincerity the pros and cons of practices they had tried in the past term, and how they hoped to continue building on these. Comments on the pre-task discussion question were insightful, and I feel I learnt much from these as I compiled them into a summary. I felt this approach to peer learning was particularly conducive, i.e. giving peers a space to contribute to a thread, time to read through other contributions, and then for the facilitator to summarise this so we have a meaningful record of our thoughts and ideas. Not everybody in the team had the confidence to present or demo tips and practices, and indeed there was no pressure to do so, but the pre-task allowed them to contribute to the event in another form.

Follow up

  • The feedback form included an area for participants to share what ideas from the event they would like to put into practice this term. I will informally review this at the end of the term [Spring 2021] and have set up a discussion thread for peers to comment on these.
  • Participants have given permission for their videos to be used more widely in ISLI for teacher development purposes.

Can Online Learning Facilitate Meaningful Interpersonal Connection?

Shelley Harris

shelley.harris@reading.ac.uk

Overview

As part of my role as a Creative Writing lecturer, I link undergraduates with professionals from the publishing industry, offering – among other things – extracurricular events for students in the School of Literature and Languages. In the past, these have broadened students’ understanding of the roles involved in publishing and given them hands-on, CV-friendly experience of the skills required in those roles. The goal is to improve students’ knowledge, confidence and employability rather than secure them a job, though sometimes they are given an unexpected leg-up: after a course in 2018, the visiting editor was so impressed by one of our students that he introduced her to a contact at Hachette.

Whatever the specifics of the event, I always seek to bring those professionals into the room – in part to demystify this competitive sector, and in part because, as a ‘high context’ industry (Hall 1977), it has historically offered jobs to the privileged: those with cultural capital. My aim is to give all our students the chance to accrue such capital.

 

 

Objectives

My ambition for the online event remained the same as its original iteration: to facilitate meaningful connections between our students and the industry guests.

Context

In Spring 2020 I organised an event for Professional Track which – after a panel discussion – would put students into informal breakout groups with early-career publishing professionals. This sort of personal contact is rare, and hugely beneficial for students with an ambition to work in publishing.

The event was scheduled for April, the tea and cake were ordered – and then lockdown occurred. With some trepidation, I redesigned it as an online experience using Blackboard Collaborate. But could an online event really enable the sorts of human connection offered by a face-to-face meeting?

Implementation

TEL’s one-to-one help sessions were a gamechanger for this project, with TEL advisor Chris Johnson offering expert guidance, including the sorts of troubleshooting tips that make all the difference to an online project. There isn’t enough space here to detail them all, but I would hugely recommend making the most of TEL’s expertise.

On the day, the event began with a conventional panel discussion in which I interviewed the guests (an editor, a publicist, a books marketer and a literary agent’s assistant) about their routes into publishing and their experience of work. Students turned off their mics and video, watched the panel and put questions into the text chat, which I then moderated. Next, I put students into small groups using Collaborate’s ‘Breakout Groups’ feature. Each included one publishing professional. I invited all participants to turn on their cameras and mics so that discussion could be more personal and informal. As facilitator, I moved between groups – not participating, but making sure things were running smoothly.

Impact

To what extent was meaningful interpersonal connection facilitated by this online event? Qualitative feedback from students suggests that the ensuing discussions were fruitful. One respondent said: ‘Engaging with the industry professionals in the smaller groups was something that I found to be particularly helpful’, while another said they appreciated ‘talking to individuals with real experience in the sector I am curious about working in’.

As with the previous course, one student benefited in an immediate way: a guest speaker offered to show her artwork to a children’s publisher. It was encouraging evidence that remote events can bring people together.

Indeed, there were aspects of the online event that seemed to offer advantages over face-to-face meeting; online, there’s a hierarchy of depersonalisation, from a simulacrum of face-to-face (cameras and mics on) through audio only, to text chat which identifies students by name and finally the anonymity of Collaborate’s whiteboard function. This is hard to reproduce in a bricks-and-mortar seminar room – and it liberates participants.

An example of that liberation came in two of the small group discussions, when talk was slow to start and the guest speakers asked students to put questions into text chat instead. Conversation picked up, and once it was under way, students were invited to activate their cameras and microphones. On reflection, I’d start all small group discussion like this next time. Feedback in answer to a question about the ways in which the online event was better than an in-person one suggests how much safer this can make students feel, and how it can lower inhibitions about joining in.

Reflections

We all accept that in-person encounters offer us ways of connecting to each other that are hard to reproduce online, but the reverse is also true. It’s something our neurodivergent students already know (Satterfield, Lepage and Ladjahasan 2015), but my experience on this project has made me sharply aware of the ways in which all participants stand to benefit.

The ‘Get into Publishing’ event has left me cautiously optimistic about facilitating meaningful social connections in the online environment, and keen to further explore its unique social opportunities. And, as Gilly Salmon (2011) makes clear, those connections are not just ‘extras’ – they are absolutely central to successful remote learning.

Links and References

Hall E T (1977), Beyond Culture. Garden City, NY: Anchor Books

Satterfield D, Lepage C & Ladjahasan N (2015) ‘Preferences for online course delivery methods in higher education for students with autism spectrum disorders’, Procedia Manufacturing, 3, pp. 3651-3656

Salmon G (2011), E-Moderating: The Key to Online Teaching and Learning. New York: Routledge, p. 36

Improving student assessment literacy & engaging students with rubrics

Dr. Allan Laville

School of Psychology & Clinical Languages Sciences

In this 14-minute video, early rubrics adopter Dr. Allan Laville shares how he and colleagues in Psychology have sought to improve student assessment literacy, and have successfully engaged students with their assessment rubrics by embedding analysis of them into their in-class teaching and by using screencasts, discussion boards and student partnership. Lots of useful ideas and advice – well worth a watch.

Promoting and Tracking Student Engagement on an Online Undergraduate Pre-sessional Course

Sarah Mattin: International Study and Language Institute

Overview

This case study outlines approaches to fostering an active learning environment on the University’s first fully online Undergraduate Pre-sessional Course which ran in Summer 2020 with 170 students. It reports staff and student feedback and reflects on how lessons learnt during the summer can inform ISLI’s continued online delivery this autumn term and beyond.

 

Objectives

  • To design and deliver an online Pre-sessional Course to meet the needs of 170 students studying remotely, mostly in China
  • To promote student engagement in learning activities in an online environment
  • To devise effective mechanisms for tracking student engagement and thus identify students who may require additional support

 

Context

The Pre-sessional Programme (PSE) is an English for Academic Purposes (EAP) and academic skills development programme for degree offer holders who require more study to meet the English Language requirements of their intended programme. The programme runs year-round and, in the summer, has separate UG and PG courses. We would usually expect to welcome around 700 students to the campus for the summer courses (June-September); in summer 2020 we took the courses fully online in response to the COVID crisis. This case study focuses on the Undergraduate Course.

 

Implementation

Due to the constraints of the time difference between the UK and China, where most students were based, we knew learning on the course would need to be largely asynchronous. However, we were keen to promote active learning and so adopted the following approaches:

  • Use of the online authoring tool Xerte to create interactive learning materials which enabled students to have immediate feedback on tasks.
  • Incorporation of asynchronous peer and student-teacher interaction into the course each week through scaffolded tasks for the Blackboard Discussion Boards.
  • Setting up of small study groups of 3-4 students within each class of 16 students. Each group had fortnightly tutorials with the teacher and were encouraged to use the group for independent peer support.
  • Live online interactive sessions which took a ‘flipped’ approach, so students came prepared to share and discuss their work on a set task and ask any questions.

In order to track engagement with the learning materials we used Blackboard Tests to create short (4-5 questions) ‘Stop & Check’ quizzes at regular intervals throughout the week. We used the Grade Centre to monitor completion of these. We also made use of other student engagement monitoring features of Blackboard, in particular the Retention Centre within Evaluation and Blackboard Course Reports which enable instructors to track a range of user activity.

 

Impact

Our tracking showed that most students were engaging with the tasks daily, as required. We were very quickly able to identify a small group of students who were not engaging as hoped and target additional communication and support to these students.

Student feedback demonstrated that students perceived improvements in their language ability across the four skills (reading, writing, speaking and listening), and this was confirmed by their results at the end of the course. Student outcomes were good, with over 90% of students achieving the scores they needed to progress to their chosen degree programme. This compares favourably with the progression rate for the on-campus course which has run in previous years.

Feedback from teachers on the learning materials was very positive. One teacher commented that ‘The videos and Xerte lessons were excellent. As a new teacher I felt the course was very clear and it has been the best summer course I have worked on’. Teachers highlighted Xerte, the Discussion Boards and the interactive sessions as strengths of the course.

The materials and overall design of the course have informed the Pre-sessional Course (PSE 1) which is running this Autumn Term.

 

Reflections

Overall, we designed and delivered a course which met our objectives. Some reflections on the tools and approaches we employed are as follows:

Xerte lessons: these were definitely a successful part of the course, enabling us to provide interactive asynchronous learning materials with immediate feedback to students. We also found the Xerte lessons enabled us to make coherent ‘packages’ of smaller tasks, helping us to keep the Blackboard site uncluttered and easy to navigate.

Discussion Boards: teacher feedback indicated that this was a part of the course some felt was an enhancement of the previous F2F delivery. Points we found were key to the success of Discussion Board tasks were:

  • Creation of a new thread for each task to keep threads a manageable size
  • Linking to the specific thread from the task using hyperlinks
  • Detailed and specific Discussion Board task instructions for students broken down into steps of making an initial post and responding to classmates’ posts with deadlines for each step
  • Teacher presence on the Discussion Board
  • Teacher feedback on group use of the Discussion Board in live sessions to reinforce the importance of peer interaction

Small study groups: these were a helpful element of the course and greater use could have been made of them. For example, one teacher developed a system of having a rotating ‘group leader’ who took responsibility for guiding the group through an assigned task each week. In the future we could incorporate this approach and build more independent group work into the asynchronous learning materials to reinforce the importance of collaboration and peer learning.

Live sessions: student feedback showed clearly that this was an aspect of the course they particularly valued. Both students and teachers felt there should be more live contact but that these do not need to be long sessions; even an additional 30 minutes a day would have made a difference. Teachers and students commented that Teams provided a more stable connection for students in China than Blackboard Collaborate.

Blackboard Tests and monitoring features of Blackboard: these were undoubtedly useful tools for monitoring student engagement. However, they generate a great deal of data which is not always easy to interpret ‘at a glance’ and provides a fairly superficial account of engagement. Most teachers ended up devising their own tracking systems in Excel which enabled them to identify and track performance on certain key tasks each week.
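As a rough illustration of the kind of weekly tracking teachers built in Excel, the sketch below flags students whose ‘Stop & Check’ completion rate falls below a threshold. The student names, data and threshold are invented for the example; a real export from Grade Centre would of course look different.

```python
# Illustrative sketch of a weekly engagement check: flag students whose
# 'Stop & Check' quiz completion rate falls below a threshold.
# Names and completion data are made up for the example.

def flag_low_engagement(completions, threshold=0.75):
    """completions maps student -> list of True/False quiz completions.
    Return (student, rate) pairs below the threshold, lowest first."""
    flagged = []
    for student, done in completions.items():
        rate = sum(done) / len(done) if done else 0.0
        if rate < threshold:
            flagged.append((student, rate))
    return sorted(flagged, key=lambda pair: pair[1])


week1 = {
    "Student A": [True, True, True, True],    # completed everything
    "Student B": [True, False, True, False],  # 50% - needs follow-up
    "Student C": [False, False, True, False], # 25% - needs follow-up
}
for student, rate in flag_low_engagement(week1):
    print(f"{student}: {rate:.0%} of quizzes completed")
```

A spreadsheet formula does the same job; the point is simply that a weekly, per-student completion rate against a few key tasks gives an at-a-glance picture that the raw Blackboard reports do not.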

 

Follow up

Taking into account the feedback from this year, materials developed could be used in future to facilitate a flipped learning approach on the course, with students studying on campus or remotely. This would address the calls for more teacher-student interaction and enable the course to respond flexibly to external events. Currently, we are applying lessons learnt from the summer to the delivery of our Pre-sessional and Academic English Programmes running this term.

 

Links

The Pre-sessional English and Academic English Programme webpages give more details about the Programmes

Pre-sessional: http://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Academic English Programme: http://www.reading.ac.uk/ISLI/enhancing-studies/isli-aep.aspx

Misconceptions About Flipped Learning


 

During the COVID-19 pandemic, colleagues at UoR were called on to adjust their courses almost overnight, from face-to-face teaching to fully online delivery. As the immediate future is still full of uncertainty, the UoR (2020) teaching and learning framework asks us to be creative in our pedagogical approaches and to come up with strategies that make courses stimulating and engaging. Flipped learning is one of the approaches suggested in the framework. With that in mind, I have written two articles about flipped learning, published here and here.

Flipped learning is a pedagogical approach whose time has come during COVID-19. The advancement of internet technology, online learning platforms and social media, combined with growing exposure to the flipped learning approach, has promoted its adoption during this pandemic. However, despite its popularity and the published literature about it, there are evidently many misconceptions about flipped learning, and it remains a somewhat poorly-understood concept for many.

In this last article, I thought I would write about and share with you some of the misconceptions about flipped learning that resonate most with me. At the same time, let us reflect on them and see how we can overcome them where possible. Your feedback is always welcome; please do send me your thoughts via w.tew@henley.ac.uk

 

Misconception 1: Flipped learning is about putting video content online

Reflection: This may be the most popular format for flipped learning, but it is NOT simply about putting videos online and having students do homework in class (or online during this pandemic). The UoR (2020) Teaching and Learning: Framework for Autumn term 2020 encourages us to prepare our teaching and lectures in a video format. This format works well with the flipped learning instructional strategy for delivering teaching content, but flipped learning can be about much more than that. Colleagues can opt for videos or just text (reading) materials if they flip their lessons. For example, we can make good use of the Blackboard LMS to include online reading materials via Talis Aspire, journal articles, case studies and news items that are relevant for our students. In other words, flipped learning does not have to rely entirely on videos.

 

Misconception 2: You need to be in the video

Reflection: This is not necessarily the case, especially as many of us feel shy and ‘unnatural’ in front of the camera; that is certainly how I feel myself. This is why a voice-recorded PowerPoint can be a ‘lifesaver’ for many of us. Having said that, appearing in the video does add a personal touch to the learning materials. For example, wearing different hats when filming your videos makes them more interesting and ‘draws’ students’ attention to your content and lessons. Try it; you may even earn a “Mad Hatter” title from your students. Just one of my crazy ideas.

 

Misconception 3: You need to flip your entire module 

Reflection: Many of us assume that we need to flip our entire module for the whole academic year. Not necessarily so! The whole idea of flipped learning is to foster student-centred learning, with teaching personalised to suit students’ needs and learning pace. You can therefore flip just one concept or topic, an entire term, or a few weeks. Remember, the focus is on students’ learning needs: a one-size-fits-all approach definitely does not fit a flipped learning environment.

 

Misconception 4: Flipped learning is a fad and people have been doing it for years

Reflection: This was my initial thought when I first came to know about flipped learning. A fad is defined as “a style, activity, or interest that is very popular for a short period of time”: an innovation that never takes hold. Flipped learning is anything but this. The fact that it is still actively studied and researched today shows that it is not just a fad. Talbert (2017) argued that flipped learning is not merely a rebranding of old techniques: it has its own pedagogical framework and demonstrable effects on learning. In brief, the definition of flipped learning (see Flipped Learning Network, 2014) differentiates it from other learning approaches.

 

Misconception 5: Flipping the classroom takes too much time

Reflection: To be honest, I do think this is true. Preparing for flipped learning and flipping lessons takes a lot of energy and time. Based on my own experience, I can testify that it takes a significant amount of time, depending on how tech-savvy the teacher is and how much of the teaching content needs to be flipped. However, here is the irony: once the content is designed, it saves time. That has been my experience. Once the work is done, you can reuse the same content year after year, and subsequent updates and changes will not take anywhere near as much time as creating everything from scratch.

Finally, I hope you have enjoyed my series on flipped learning published on this platform. I sincerely urge you to consider the flipped learning approach during this pandemic, and please do not hesitate to get in touch to continue this conversation.

References

Flipped Learning Network (FLN) (2014) The Four Pillars of F-L-I-P™. Available at: www.flippedlearning.org/definition

Talbert, R. (2017) Flipped Learning: A Guide for Higher Education Faculty. Stylus Publishing, LLC.

UoR (2020) Teaching and Learning: Framework for Autumn term 2020. Available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

How ISLI’s Assessment Team created an online oral exam for the Test of English for Educational Purposes (TEEP)

Fiona Orel – International Study and Language Institute (ISLI)

 

Overview

ISLI’s Test of English for Educational Purposes (TEEP) is administered at the end of pre-sessional courses as a measure of students’ academic English proficiency. The speaking test has traditionally been an academic discussion between two students that is facilitated by an interlocutor and marked by an observer.

This case study outlines the process of creating a version of the TEEP speaking test for 1-1 online delivery.

Objectives

  • To create an online TEEP speaking test that could be administered at the beginning of June to 95 students
  • To ensure reliability and security of results
  • To support students and teachers with the transition

Context

The Pre-sessional English course 3 (PSE 3) started in April, during the period of lockdown. At the end of the course all students sit a TEEP test, which includes a test of speaking skills. We realised that we would not be able to administer the usual two-student, two-teacher test, given the constraints of the technology and the changes to teaching and learning that had reduced, to a certain degree, students’ opportunities for oral interaction. We would therefore need to develop a new 1-1 test that maintained the validity and reliability of the original TEEP Speaking test.

Implementation

We had two main objectives: to create a valid online 1-1 speaking test, and to make sure that the technology we used to administer the test was simple and straight-forward for both teachers and students, and would have reasonably reliable connectivity in the regions where students were based (China, Middle East and UK).

The first thing we needed to do was return to our test specifications: what exactly were we hoping to assess through the oral exam? The original face-to-face test had five criteria: overall communication, interaction, fluency, accuracy and range, and intelligibility. We knew that interaction had been impacted by the move online, but decided that responding appropriately to others was a crucial aspect of interaction that needed to remain, so we included it in the ‘overall communication’ criterion. Recognising that interlocutors would also need to be examiners, we streamlined the criteria to remove redundancy and repetition and ensured that each block contained the same type of description in the same order, thereby making it easier for tutors to skim and recall.

We then worked out exactly what speaking functions and skills we wanted to test, and how we could do that while mostly working with existing resources. We aligned with the original test specifications by testing students’ ability to:

  • Provide appropriate responses to questions and prompts
  • Describe experiences and things
  • Give and justify an opinion by, for example, stating an opinion, explaining causes and effects, comparing, evaluating.

The test format that enabled this was:

  • Part one: an interview with the student about their studies and experience of studying online
  • Part two: a problem-solving scenario: students are introduced to a problem, which the teacher screen-shares with them, and are given three possible solutions to compare, evaluate and rank from most to least effective
  • Part three: abstract discussion building on the talk given in part two

The final stage was trialling a platform to conduct the tests. We had considered Zoom due to its reliability but discounted it due to security concerns. BB Collaborate had connectivity issues in China so we decided to use Teams as connectivity was generally better and students and teachers were familiar with the platform as they had been using it for tutorials. Due to the spread of students over time zones, we decided to spread the speaking tests over three mornings finishing by 11:00 BST on each day. We kept the final slot on Friday free for all teachers to enable rescheduling of tests for any student experiencing issues with connectivity on the day.

Finally, we needed to help teachers and students prepare for the tests. For students, learning materials were produced with videos of a sample test, there was a well-attended webinar to introduce the format and requirements, and the recording of this webinar was made available to all students along with a document on their BB course. This instructed them what to do before test day and what to expect on test day.

The test format and procedures were introduced to teachers, with instructions for tasks to complete before, during and after the test. An examiner’s script with integrated instructions and speech was also prepared to standardise how the tests were administered. Each test was recorded to ensure security and enable moderation, and all students had to verify their identity at the start of the test. The recording caused some problems, as we knew the video would have to be downloaded and deleted from Stream before anyone else in the Teams meeting, including the student who had been tested, could access it. For this reason we allowed 40 minutes for each 20-minute interview, as downloading was sometimes a lengthy process depending on internet speeds. We had two or three people available each day to pick up any problems, such as a teacher being unwell or having technical issues, or a student experiencing problems. This worked well. On the first two days we did have to reschedule a number of tests; fortunately, all went well on the final day. The teachers were fully committed and worked hard to put students at ease. Informal feedback from students showed appreciation of the opportunity to talk 1-1 with a tutor, and tutors said that the test format provided plenty of evidence upon which to base a decision.

Impact

The test was successful overall and there were fewer technical issues than we had anticipated. Teachers and students were happy with it as an assessment measure and we were able to award valid and reliable grades.

Working collaboratively with the teachers and the Programme Director was incredibly rewarding, and meant that we had a wide resource base of talent and experience whenever we ran into problems.

Reflections

Incredibly detailed planning, the sharing of information across Assessment and Pre-sessional Teams, and much appreciated support from the TEL team helped to make the test a success. Students and teachers had very clear and detailed instructions and knew exactly what was expected and how the tests would be conducted. The sharing of expertise across teams meant that problems were solved quickly and creatively, and it is good to see this practice becoming the norm.

We need to work on the issue of downloading and deleting the video after each test, as this caused anxiety for some teachers with slower internet connections. We also need to have more technical support available, especially on the first day. Most students had tested their equipment as instructed, but some who hadn’t did experience issues. It would be even better if a similar activity could be built into the course so that teachers and students experience the test situation before the final test.

Follow up

ISLI’s Assessment Team is now preparing to administer the same tests to a much larger cohort of students at the end of August. We will apply the lessons learned during this administration and work to make the process easier for teachers.

Taking Academic Language and Literacy Courses Online

Dr Karin Whiteside, ISLI

Overview

Alongside its embedded discipline-specific provision, the Academic English Programme (AEP) offers a range of open sign-up academic language and literacy courses each term. This case study outlines the process of rapidly converting the summer term provision online, and reports student feedback and reflections on the experience which will help inform continued online delivery this autumn term.

Objectives

Our aim was to provide academic language and literacy support which, as far as practicably possible, was equivalent in scope and quality to our normal face-to-face offering for the same time of year. In the summer term, our provision is particularly important for master’s students working on their dissertations: high numbers apply for Dissertation & Thesis Writing, while courses such as Core Writing Skills and Academic Grammar provide important ‘building block’ input needed for competent research writing.

Context

Prior to the COVID crisis, our face-to-face courses on different aspects of written and spoken Academic English were offered for open application on a first-come, first-served basis, with a rolling weekly waiting list. With a maximum of 20 students per class, we were able to offer interactive, task-based learning involving analysis of target language and communicative situations in context, practice exercises, and opportunities for discussion and feedback within a friendly small-group environment.

Implementation

Working within an extremely tight turnaround time of four weeks, we settled on a slightly slimmed-down programme of five ‘open-to-all’ online courses (Academic Grammar, Core Academic Writing Skills, Dissertation & Thesis Writing, Essays: Criticality, Argument, Structure and Listening & Note-taking) and replaced our normal application process with self-enrolment via Blackboard, meaning uncapped numbers could sign up and have access to lessons.

Time constraints meant we had to be pragmatic about where to focus our energies. Converting course content online needed to be done in a way that was both effective and sustainable, given the potential continued need for online AEP provision into 2020/21. We predicted (rightly!) that initially converting small-group interactive learning materials to an online format that retained their inductive, task-based qualities would be labour-intensive and time-consuming. Therefore, for the short term (summer 2020) we adopted a primarily asynchronous approach, with a view to increasing the proportion of synchronous interactivity in future iterations once content was in place. In converting face-to-face lessons to online ones, we found that what often worked most effectively was to break the content of a two-hour face-to-face lesson into 2-3 task-focused online parts, each introduced and concluded with short, narrated PowerPoints/MP4 videos. We set a weekly release date for lesson materials on each course, often accompanied by a ‘flipped’ element, labelled ‘Pre-lesson Task’, released a few days before the main lesson materials. We set up accompanying weekly Discussion Forums where students could ask questions or make comments, with one ‘live’ hour per week. Apart from Pre-lesson Tasks, task answers were always made available at the same time as lessons, to allow students complete autonomy.

Moving rapidly to online delivery meant not necessarily having the highest-specification e-learning tools immediately to hand, but instead working creatively to get the best out of existing technologies, including the Blackboard platform, which prior to this term had had a mainly ‘depository’ function in AEP. To ensure ease of navigation, the various attachments involved in creating such lessons needed to be carefully curated by Folder and Item within BB Learning Materials. Key to this was clear naming and sequencing, with accompanying instructions at Folder and Item level.

Impact, Reflections and Follow-up

Positive outcomes of taking the summer AEP provision online have included noticeably higher uptake (e.g. in Academic Grammar, 92 self-enrolments compared to 30 applications in summer term 2018/19) and noticeably higher real engagement (e.g. with an average of 11 students attending the 2018/19 summer face-to-face Academic Grammar class, compared to a high of 57 and average of 38 students accessing each online lesson). Running the courses asynchronously online has meant no waiting lists, allowing access to course content to all students who register interest. It also means that students can continue to join courses and work through materials over the summer vacation period, which is particularly useful for international master’s students working on Dissertations for September submission, and for cohorts overseas such as the IoE master’s students in Guangdong.

In survey responses gathered thus far, response to course content has been largely positive: “It provided me an insight into what is expected structure and criticality. Now that I am writing my essay, I could see the difference”. Students appreciated teacher narration, noticing if it was absent: “I would prefer our teacher to talk and explain the subject in every slide.” The clarity of lesson presentation within Blackboard was also noted: “I think the most impressive part in this course is the way these lessons were arranged in BB as every lessons were explicitly highlighted, divided into parts with relevant tasks and their answers. Thus, I could effectively learn the content consciously and unconsciously.”

There was a range of reactions to our approach to online delivery and to online learning more generally. 52% of students were happy with entirely asynchronous learning, while 48% would have preferred a larger element of real-time interactivity: “Although this lessons ensured the freedom in dealing with the material whenever it was possible, the lack of a live-scheduled contact with the teacher and other students was somewhat dispersive.”; “I prefer face to face in the classroom because it encourages me more to contribute”. In normal circumstances, 34% of students said they would want entirely face-to-face AEP classes, whilst 21% would want a blended provision and 45% would prefer learning to remain entirely online, with positive feedback regarding the flexibility of the online provision: “it’s flexible for students to do it depending on their own time.”; “Don’t change the possibility to work asynchronously. It makes it possible to follow despite being a part time student.” Going forward, we plan to design in regular synchronous elements in the form of webinars which link to the asynchronous spine of each course, to respond to students’ requests for more live interactivity. We also plan to revisit and refine our use of Discussion Forums in Blackboard. Whilst engagement with lesson content was high, students made limited use of Q&A Forums. It is hoped that more targeted forums directly linked to flipped tasks will encourage greater engagement with this strand of the online delivery in the future.

Links

The AEP website ‘Courses, Workshops and Webinars’ page, which gives details of this summer term’s courses and what will be on offer in autumn: http://www.reading.ac.uk/ISLI/enhancing-studies/academic-english-programme/isli-aep-courses.aspx

Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we have created a Blackboard feedback tool which draws on self-help strategies used in NHS mental health services. This was a TLDF-funded project through CQSD, and we reflect upon the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are consistently rated as the lowest item on student surveys, despite staff efforts to address this. Whilst staff can certainly continue to improve their practices around providing feedback, our efforts turned to how we could improve student engagement in this area. On investigating existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the ‘Developing Engagement with Feedback Toolkit (DEFT)’ has useful components, such as a glossary helping students with academic jargon, but it does not provide resources to help with feedback-related stress. Our aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

 Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on the use of self-help resources, such as cognitive restructuring, used within the NHS to treat anxiety and depression. These ideas were taken to a student focus group to gather students’ thoughts and opinions on what type of resource they would like to help them understand and use their feedback.

Drawing on ideas from the literature review and the focus group, we established the various components of the tool: a ‘purpose of feedback’ video, problem-solving and cognitive restructuring techniques, a reflective log, and a ‘where to go for further support’ page. We then started creating our prototype Blackboard tool. At the tool creation stage, we worked collaboratively with the TEL team (Maria, Matt and Jacqueline) to format and launch the tool. Upon launch, students were given access to the tool via Blackboard, along with a survey to complete once they had explored and used it.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to identify, challenge and rebalance students’ negative or stressful thoughts related to receiving feedback. Some students reported in the tool survey that they found this technique useful.

The examples also seemed to help students link the tool to their past experiences of not getting a good grade. Students appreciated the interactive features, such as the video of the lecturer [addressing the fact that feedback is not a personal attack], and were looking forward to the tool being fully implemented in their next academic year. Overall, the student survey responses were positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the focus group and literature review contributions: the students’ ideas from the focus group built on the evidence-based self-help ideas gathered from the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in future academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make the tool even more user-friendly, we could improve the initial main contents page: for example, presenting options like ‘I want to work on improving x’ that then lead on to the appropriate self-help resource, instead of simply starting with the resource options [e.g. problem solving, reflective log].

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk , Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments in the Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership, as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell. Chloe is a current PhD student and previous MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the electronic assessment overview concept and then approached Chloe Chessell to see whether she wanted to take part in developing these overviews. It was important to include Chloe as she has lived experience of completing the programme and can therefore offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed added to the information that I had as a student, as this format allows students to review assessment information in their own time, and at their own pace. Screencasts can also be revisited, which may help students to ensure they have met the marking criteria for a specific assessment. Furthermore, embedded videos/links to information to support the development of key writing skills (e.g. critical analysis skills) within these screencasts expand upon the information my cohort received, and will help students to develop these skills at the onset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project, as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, we now have a further four student-staff projects currently underway, which we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from students’ experiences of receiving course information and their views on course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage with their course at a higher level, allowing them to have a role in shaping the course around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.