Promoting and Tracking Student Engagement on an Online Undergraduate Pre-sessional Course

Sarah Mattin: International Study and Language Institute

Overview

This case study outlines approaches to fostering an active learning environment on the University’s first fully online Undergraduate Pre-sessional Course which ran in Summer 2020 with 170 students. It reports staff and student feedback and reflects on how lessons learnt during the summer can inform ISLI’s continued online delivery this autumn term and beyond.

 

Objectives

  • To design and deliver an online Pre-sessional Course to meet the needs of 170 students studying remotely, mostly in China
  • To promote student engagement in learning activities in an online environment
  • To devise effective mechanisms for tracking student engagement and thus identify students who may require additional support

 

Context

The Pre-sessional Programme (PSE) is an English for Academic Purposes (EAP) and academic skills development programme for degree offer holders who require more study to meet the English Language requirements of their intended programme. The programme runs year-round and, in the summer, has separate UG and PG courses. We would usually expect to welcome around 700 students to the campus for the summer courses (June-September); in summer 2020 we took the courses fully online in response to the COVID crisis. This case study focuses on the Undergraduate Course.

 

Implementation

Due to the constraints of the time difference between the UK and China, where most students were based, we knew learning on the course would need to be largely asynchronous. However, we were keen to promote active learning and so adopted the following approaches:

  • Use of the online authoring tool Xerte to create interactive learning materials which enabled students to have immediate feedback on tasks.
  • Incorporation of asynchronous peer and student-teacher interaction into the course each week through scaffolded tasks for the Blackboard Discussion Boards.
  • Setting up of small study groups of 3-4 students within each class of 16 students. Each group had fortnightly tutorials with the teacher and was encouraged to use the group for independent peer support.
  • Live online interactive sessions which took a ‘flipped’ approach, so students came prepared to share and discuss their work on a set task and ask any questions.

In order to track engagement with the learning materials we used Blackboard Tests to create short (4-5 questions) ‘Stop & Check’ quizzes at regular intervals throughout the week. We used the Grade Centre to monitor completion of these. We also made use of other student engagement monitoring features of Blackboard, in particular the Retention Centre within Evaluation and Blackboard Course Reports which enable instructors to track a range of user activity.

 

Impact

Our tracking showed that most students were engaging with the tasks daily, as required. We were very quickly able to identify a small group of students who were not engaging as hoped and target additional communication and support to these students.

Student feedback showed that students perceived improvements in their language ability across the four skills (reading, writing, speaking and listening), and this was confirmed by their results at the end of the course. Student outcomes were good, with over 90% of students achieving the scores they needed to progress to their chosen degree programme. This compares favourably with the progression rate for the on-campus course in previous years.

Feedback from teachers on the learning materials was very positive. One teacher commented that ‘The videos and Xerte lessons were excellent. As a new teacher I felt the course was very clear and it has been the best summer course I have worked on’. Teachers highlighted Xerte, the Discussion Boards and the interactive sessions as strengths of the course.

The materials and overall design of the course have informed the Pre-sessional Course (PSE 1) which is running this Autumn Term.

 

Reflections

Overall, we designed and delivered a course which met our objectives. Some reflections on the tools and approaches we employed are as follows:

Xerte lessons: these were definitely a successful part of the course, enabling us to provide interactive asynchronous learning materials with immediate feedback to students. We also found that the Xerte lessons enabled us to make coherent ‘packages’ of smaller tasks, helping us to keep the Blackboard site uncluttered and easy to navigate.

Discussion Boards: teacher feedback indicated that some felt this part of the course was an enhancement over the previous F2F delivery. Points we found key to the success of Discussion Board tasks were:

  • Creation of a new thread for each task to keep threads a manageable size
  • Linking to the specific thread from the task using hyperlinks
  • Detailed and specific Discussion Board task instructions for students broken down into steps of making an initial post and responding to classmates’ posts with deadlines for each step
  • Teacher presence on the Discussion Board
  • Teacher feedback on group use of the Discussion Board in live sessions to reinforce the importance of peer interaction

Small study groups: these were a helpful element of the course and greater use could have been made of them. For example, one teacher developed a system of having a rotating ‘group leader’ who took responsibility for guiding the group through an assigned task each week. In the future we could incorporate this approach and build more independent group work into the asynchronous learning materials to reinforce the importance of collaboration and peer learning.

Live sessions: student feedback showed clearly that this was an aspect of the course they particularly valued. Both students and teachers felt there should be more live contact, but that sessions need not be long; even an additional 30 minutes a day would have made a difference. Teachers and students commented that Teams provided a more stable connection for students in China than Blackboard Collaborate.

Blackboard Tests and monitoring features of Blackboard: these were undoubtedly useful tools for monitoring student engagement. However, they generate a great deal of data which is not always easy to interpret ‘at a glance’ and which provides a fairly superficial account of engagement. Most teachers ended up devising their own tracking systems in Excel, which enabled them to identify and track performance on certain key tasks each week.
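The kind of weekly completion tracking teachers built by hand in Excel can also be scripted. The sketch below is an illustration only: the column names and CSV layout are invented for the example (a real Grade Centre download has its own headers), but the flagging logic — count the blank quiz cells per student and surface anyone over a threshold — is the same idea the teachers' spreadsheets implemented.

```python
# Minimal sketch of a 'Stop & Check' completion tracker.
# The CSV layout below is hypothetical; a real Blackboard Grade
# Centre export uses different headers, but the logic carries over.
import csv
import io

SAMPLE_EXPORT = """\
student,quiz1,quiz2,quiz3,quiz4
Chen Wei,10,8,9,7
Li Na,9,,,
Zhang Min,7,8,,9
"""

def flag_low_engagement(csv_text, max_missed=1):
    """Return (student, missed_count) pairs for students who missed
    more than max_missed quizzes; an empty cell means no attempt."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        missed = sum(
            1 for field, value in row.items()
            if field != "student" and not value.strip()
        )
        if missed > max_missed:
            flagged.append((row["student"], missed))
    return flagged

print(flag_low_engagement(SAMPLE_EXPORT))  # [('Li Na', 3)]
```

A short list like this, produced at the end of each week, gives the ‘at a glance’ view the built-in reports lacked and identifies the students to target with additional communication.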

 

Follow up

Taking into account the feedback from this year, the materials developed could be used in future to facilitate a flipped learning approach on the course, with students studying on campus or remotely. This would address the calls for more teacher-student interaction and enable the course to respond flexibly to external events. Currently, we are applying lessons learnt from the summer to the delivery of our Pre-sessional and Academic English Programmes running this term.

 

Links

The Pre-sessional English and Academic English Programme webpages give more details about the Programmes:

Pre-sessional: http://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Academic English Programme: http://www.reading.ac.uk/ISLI/enhancing-studies/isli-aep.aspx

Misconceptions About Flipped Learning


 

During the COVID-19 pandemic, colleagues at UoR were called upon to adjust their courses almost overnight, from face-to-face teaching to fully online delivery. As the immediate future is still full of uncertainty, the UoR (2020) Teaching and Learning Framework asks us to be creative in our pedagogical approaches and to come up with strategies that make courses stimulating and engaging. Flipped learning is one of the approaches suggested in the framework. With that in mind, I have written two articles about flipped learning, published here and here.

Flipped learning is a pedagogical approach whose time has come during COVID-19. The advancement of internet technology, online learning platforms and social media, combined with growing exposure to the flipped learning approach, has promoted its adoption during this pandemic. However, despite its popularity and the published literature about it, there are evidently many misconceptions about flipped learning, and it remains a somewhat poorly understood concept among many.

In this last article, I thought I would write and share with you some of the misconceptions about flipped learning that resonate most with me. At the same time, let us reflect on them and see how we can overcome them where possible. Your feedback is always welcome; please do send me your thoughts via w.tew@henley.ac.uk

 

Misconception 1: Flipped learning is about putting video contents online

Reflection: This may be the most popular format for flipped learning, but it is NOT simply about putting videos online and having students do homework in class (or online during this pandemic). The UoR (2020) Teaching and Learning: Framework for Autumn term 2020 encourages us to prepare our teaching and lectures in video format. This format works well with the flipped learning instructional strategy for delivering teaching content, but flipped learning can be about much more than that. Colleagues can opt for videos or just text (reading) materials when they flip their lessons. For example, we can make good use of the Blackboard LMS platform to include online reading materials via Talis Aspire, journal articles, case studies and news items relevant to our students. In other words, flipped learning need not rely on videos at all.

 

Misconception 2: You need to be in the video

Reflection: This is not necessarily the case, especially as many of us are shy and ‘unnatural’ in front of the camera, just as I am myself. This is why the voice-recorded PowerPoint format can be a ‘lifesaver’ for many of us. Having said that, appearing in the video does add a personal touch to the learning materials for students. For example, wearing different hats when filming your videos makes them more interesting and ‘draws’ students’ attention to your content and lessons. Try it; you might earn a ‘Mad Hatter’ title from your students. Just one of my crazy ideas.

 

Misconception 3: You need to flip your entire module 

Reflection: Many of us assume that we need to flip our entire module for the entire academic year. Not necessarily so! The whole idea of flipped learning is to foster student-centred learning, and teaching can be personalised to suit students’ needs and learning pace. Therefore, you can flip just one concept or topic, an entire term, or some weeks. Remember, the focus is on the students’ learning needs – a one-size-fits-all approach definitely does not fit in a flipped learning environment.

 

Misconception 4: Flipped learning is a fad and people have been doing it for years

Reflection: This was my initial thought when I first came to know about flipped learning. A fad is defined as “a style, activity, or interest that is very popular for a short period of time” – an innovation that never takes hold. Flipped learning is anything but this. The fact that it is still actively studied and researched today shows that it is not just a fad. Talbert (2017) argued that flipped learning is not just a rebranding of old techniques; it has its own pedagogical framework and value in its effects on learning. In brief, the definition of flipped learning (see Flipped Learning Network, 2014) differentiates it from other learning theories.

 

Misconception 5: Flipping the classroom takes too much time

Reflection: To be honest, I do think this is true. Preparing for flipped learning and flipping lessons involves a lot of energy and time; based on my own experience, I can personally testify that it can take a significant amount of time. It also depends on how tech-savvy the teacher is and how much of the teaching content needs to be flipped. However, the hard labour and time investment bear fruit: once the materials are designed, they will save time. Ironic, isn’t it? That has been my experience. What I am trying to show you is that once it is done, you will be able to use the same content over and over again, year after year. Any updates and changes to the content will not take as much time as creating everything from scratch again.

Finally, I hope you have enjoyed my series on flipped learning published on this platform. I sincerely urge you to consider the flipped learning pedagogical approach during this pandemic, and please do not hesitate to get in touch to continue this conversation.

References

Flipped Learning Network (FLN) (2014) The Four Pillars of F-L-I-P™. Reproducible PDF available at www.flippedlearning.org/definition.

Talbert, R. (2017) Flipped Learning: A Guide for Higher Education Faculty. Stylus Publishing, LLC.

UoR (2020) Teaching and Learning: Framework for Autumn term 2020, available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

How ISLI’s Assessment Team created an online oral exam for the Test of English for Educational Purposes (TEEP)

Fiona Orel – International Study and Language Institute (ISLI)

 

Overview

ISLI’s Test of English for Educational Purposes (TEEP) is administered at the end of pre-sessional courses as a measure of students’ academic English proficiency. The speaking test has traditionally been an academic discussion between two students that is facilitated by an interlocutor and marked by an observer.

This case study outlines the process of creating a version of the TEEP speaking test for 1-1 online delivery.

Objectives

  • To create an online TEEP speaking test that could be administered at the beginning of June to 95 students
  • To ensure reliability and security of results
  • To support students and teachers with the transition

Context

The Pre-sessional English course 3 (PSE 3) started in April during the period of lockdown. At the end of the course all students sit a TEEP test, which includes a test of speaking skills. We realised that we would not be able to administer the usual two-student + two-teacher test, given the constraints of the technology and the changes in teaching and learning, which had reduced to some degree the students’ opportunities for oral interaction. We would therefore need to develop a new 1-1 test that maintained the validity and reliability of the original TEEP Speaking test.

Implementation

We had two main objectives: to create a valid online 1-1 speaking test, and to make sure that the technology we used to administer the test was simple and straightforward for both teachers and students, and would have reasonably reliable connectivity in the regions where students were based (China, the Middle East and the UK).

The first thing we needed to do was to return to our test specifications – what exactly were we hoping to assess through the oral exam? The original face-to-face test had five criteria: overall communication, interaction, fluency, accuracy and range, and intelligibility. We knew that interaction had been impacted by the move online, but decided that responding appropriately to others was a crucial aspect of interaction that needed to remain, and included this in the ‘overall communication’ criterion. Recognising that interlocutors would also need to be examiners, we worked on streamlining the criteria to remove redundancy and repetition and to ensure that each block contained the same type of description in the same order, thereby making it easier for tutors to skim and recall.

We then worked out exactly which speaking functions and skills we wanted to test, and how we could do that while mostly working with existing resources. We aligned with the original test specifications by testing students’ ability to:

  • Provide appropriate responses to questions and prompts
  • Describe experiences and things
  • Give and justify an opinion by, for example, stating an opinion, explaining causes and effects, comparing, evaluating.

The test format that enabled this was:

  • Part one: an interview with the student about their studies and experience of studying online
  • Part two: a problem-solving scenario: students are introduced to a problem, which the teacher screen-shares with them, and are given three possible solutions to compare, evaluate and rank from most to least effective
  • Part three: abstract discussion building on the talk given in part two

The final stage was trialling a platform to conduct the tests. We had considered Zoom due to its reliability but discounted it due to security concerns. BB Collaborate had connectivity issues in China so we decided to use Teams as connectivity was generally better and students and teachers were familiar with the platform as they had been using it for tutorials. Due to the spread of students over time zones, we decided to spread the speaking tests over three mornings finishing by 11:00 BST on each day. We kept the final slot on Friday free for all teachers to enable rescheduling of tests for any student experiencing issues with connectivity on the day.

Finally, we needed to help teachers and students prepare for the tests. For students, learning materials were produced with videos of a sample test; there was a well-attended webinar to introduce the format and requirements, and the recording of this webinar was made available to all students along with a document on their BB course instructing them on what to do before test day and what to expect on the day.

The test format and procedures were introduced to teachers, with instructions for tasks to complete before, during and after the test. An examiner’s script with integrated instructions and speech was also prepared to standardise how the tests were administered. Each test was recorded to ensure security and to enable moderation, and all students had to verify their identity at the start of the test. The test recording caused some problems, as the video had to be downloaded and deleted from Stream before anyone else in the Teams meeting, including the student who had been tested, could access it. For this reason we allowed 40 minutes for each 20-minute interview, as downloading was sometimes a lengthy process depending on internet speeds. We had two or three people available each day to pick up any problems, such as a teacher being unwell or having technical issues, or a student experiencing problems. This worked well; on the first two days we did have to reschedule a number of tests, but fortunately all went well on the final day. The teachers were fully committed and worked hard to put students at ease. Informal feedback from students showed appreciation of the opportunity to talk 1-1 with a tutor, and tutors said that the test format provided plenty of evidence upon which to base a decision.

Impact

The test was successful overall and there were fewer technical issues than we had anticipated. Teachers and students were happy with it as an assessment measure and we were able to award valid and reliable grades.

Working collaboratively with the teachers and the Programme Director was incredibly rewarding and meant that we had a wide resource base of talent and experience whenever we ran into problems.

Reflections

Incredibly detailed planning, the sharing of information across Assessment and Pre-sessional Teams, and much appreciated support from the TEL team helped to make the test a success. Students and teachers had very clear and detailed instructions and knew exactly what was expected and how the tests would be conducted. The sharing of expertise across teams meant that problems were solved quickly and creatively, and it is good to see this practice becoming the norm.

We need to work on the issue of downloading and deleting the video after each test, as this caused anxiety for some teachers with slower internet connections. We also need to have more technical support available, especially on the first day. Most students had tested their equipment as instructed, but some who had not done so experienced issues. It would be even better if a similar activity could be built into the course so that teachers and students experience the test situation before the final test.

Follow up

ISLI’s Assessment Team is now preparing to administer the same tests to a much larger cohort of students at the end of August. We will apply the lessons learned during this administration and work to make the process easier for teachers.

Taking Academic Language and Literacy Courses Online

Dr Karin Whiteside, ISLI

Overview

Alongside its embedded discipline-specific provision, the Academic English Programme (AEP) offers a range of open sign-up academic language and literacy courses each term. This case study outlines the process of rapidly converting the summer term provision online, and reports student feedback and reflections on the experience which will help inform continued online delivery this autumn term.

Objectives

Our aim was to provide academic language and literacy support which, as far as practicably possible, was equivalent in scope and quality to our normal face-to-face offering for the same time of year. In summer term, our provision is particularly important for master’s students working on their dissertations: high numbers apply for Dissertation & Thesis Writing, while courses such as Core Writing Skills and Academic Grammar provide the important ‘building block’ input needed for competent research writing.

Context

Prior to the COVID crisis, our face-to-face courses on different aspects of written and spoken Academic English had been offered for open application on a first-come-first-served basis, with a rolling weekly waiting list. With a maximum of 20 students per class, we were able to offer interactive, task-based learning involving analysis of target language and communicative situations in context, practice exercises, and opportunities for discussion and feedback within a friendly small-group environment.

Implementation

Within an extremely tight turnaround time of four weeks, we determined a slightly slimmed-down programme of five ‘open-to-all’ online courses – Academic Grammar, Core Academic Writing Skills, Dissertation & Thesis Writing, Essays: Criticality, Argument, Structure and Listening & Note-taking – and replaced our normal application process with self-enrolment via Blackboard, meaning uncapped numbers could sign up and have access to lessons.

Time constraints meant we had to be pragmatic about where to focus our energies. Conversion of course content to an online format needed to be done in a way that was both effective and sustainable, given the potential continued need for online AEP provision going into 2020/21. We predicted (rightly!) that the initial conversion of small-group interactive learning materials to an online format that retained their inductive, task-based qualities would be labour-intensive and time-consuming. Therefore, for the short term (summer 2020) we adopted a primarily asynchronous approach, with a view to increasing the proportion of synchronous interactivity in future iterations once content was in place. In converting face-to-face lessons, we found what often worked most effectively was to break the contents of a two-hour face-to-face lesson into 2-3 task-focused online parts, each introduced and concluded with short narrated PowerPoints/MP4 videos. We determined a weekly release date for lesson materials on each course, often accompanied by a ‘flipped’ element, labelled ‘Pre-lesson Task’, released a few days before the main lesson materials. We set up accompanying weekly Discussion Forums where students could ask questions or make comments, with one ‘live’ hour per week. Apart from Pre-lesson Tasks, task answers were always made available at the same time as the lessons to allow students complete autonomy.

Moving rapidly to online delivery meant not necessarily having the highest specification e-learning tools immediately to hand but instead working creatively to get the best out of existing technologies, including the Blackboard platform, which prior to this term had had a mainly ‘depository’ function in AEP. To ensure ease of navigation, the various attachments involved in creating such lessons needed to be carefully curated by Folder and Item within BB Learning Materials. Key to this was clear naming and sequencing, with accompanying instructions at Folder and Item level.

Impact, Reflections and Follow-up

Positive outcomes of taking the summer AEP provision online have included noticeably higher uptake (e.g. in Academic Grammar, 92 self-enrolments compared to 30 applications in summer term 2018/19) and noticeably higher real engagement (e.g. with an average of 11 students attending the 2018/19 summer face-to-face Academic Grammar class, compared to a high of 57 and average of 38 students accessing each online lesson). Running the courses asynchronously online has meant no waiting lists, allowing access to course content to all students who register interest. It also means that students can continue to join courses and work through materials over the summer vacation period, which is particularly useful for international master’s students working on Dissertations for September submission, and for cohorts overseas such as the IoE master’s students in Guangdong.

In survey responses gathered so far, feedback on course content has been largely positive: “It provided me an insight into what is expected structure and criticality. Now that I am writing my essay, I could see the difference”. Students appreciated teacher narration, noticing if it was absent: “I would prefer our teacher to talk and explain the subject in every slide.” The clarity of lesson presentation within Blackboard was also noted: “I think the most impressive part in this course is the way these lessons were arranged in BB as every lessons were explicitly highlighted, divided into parts with relevant tasks and their answers. Thus, I could effectively learn the content consciously and unconsciously.”

There was a range of reactions to our approach to online delivery and to online learning more generally. 52% of students were happy with entirely asynchronous learning, while 48% would have preferred a larger element of real-time interactivity: “Although this lessons ensured the freedom in dealing with the material whenever it was possible, the lack of a live-scheduled contact with the teacher and other students was somewhat dispersive.”; “I prefer face to face in the classroom because it encourages me more to contribute”. In normal circumstances, 34% of students said they would want entirely face-to-face AEP classes, whilst 21% would want blended provision and 45% would prefer learning to remain entirely online, with positive feedback regarding the flexibility of the online provision: “it’s flexible for students to do it depending on their own time.”; “Don’t change the possibility to work asynchronously. It makes it possible to follow despite being a part time student.” Going forward, we plan to design in regular synchronous elements, in the form of webinars linked to the asynchronous spine of each course, to respond to students’ requests for more live interactivity. We also plan to revisit and refine our use of Discussion Forums in Blackboard: whilst engagement with lesson content was high, students made limited use of the Q&A Forums. It is hoped that more targeted forums, directly linked to flipped tasks, will encourage greater engagement with this strand of the online delivery in the future.

Links

The AEP website ‘Courses, Workshops and Webinars’ page, which gives details of this summer term’s courses and what will be on offer in autumn: http://www.reading.ac.uk/ISLI/enhancing-studies/academic-english-programme/isli-aep-courses.aspx

Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we have created a Blackboard feedback tool which draws on self-help strategies used in NHS mental health services. This was a TLDF-funded project via CQSD, and we reflect on the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are consistently rated as the lowest item on student surveys, despite efforts from staff to address this. Whilst staff can certainly continue to improve their practices in providing feedback, our efforts turned to how we could improve student engagement in this area. On investigating existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the ‘Developing Engagement with Feedback Toolkit’ (DEFT) has useful components, like a glossary helping students with academic jargon, but it does not provide resources to help with feedback-related stress. Our aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on the use of self-help resources, such as cognitive restructuring, used within the NHS to treat anxiety and depression. These ideas were taken to a student focus group to gather students’ thoughts and opinions on what type of resource they would like to help them understand and use their feedback.

Considering ideas from the literature review and the focus group, we established the various components of the tool: a ‘purpose of feedback’ video, problem-solving and cognitive restructuring techniques, a reflective log, and a ‘where to go for further support’ page. We then began creating our prototype Blackboard tool. At the tool creation stage, we worked collaboratively with the TEL team (Maria, Matt and Jacqueline) to help format and launch the tool. Upon launch, students were given access to the tool via Blackboard, along with a survey to complete once they had explored and used it.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to identify, challenge and re-balance students’ negative or stressful thoughts related to receiving feedback. Some students reported in the tool survey that they found this technique useful.

As well as this, the examples seemed to help students link the techniques to past experiences of not getting a good grade. Students also appreciated the interactive features, such as the video of the lecturer [addressing the fact that feedback is not a personal attack], and were looking forward to the tool being fully implemented in their next academic year. Overall, the student survey responses were positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the contributions of the focus group and the literature review: the students’ ideas from the focus group built on the evidence-based self-help approaches identified in the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in future academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make it even more user-friendly, we could improve the initial main contents page: for example, presenting options such as ‘I want to work on improving x’ which then lead to the appropriate self-help resource, instead of simply starting with the resource options [e.g. problem solving, reflective log].

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk, Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments on the Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership, as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell. Chloe is a current PhD student and previous MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the electronic assessment overview concept and then approached Chloe Chessell to see whether she wanted to take part in the development of these overviews. It was important to include Chloe here as she has lived experience of completing the programme and therefore, can offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed added to the information that I had as a student, as this format allows students to review assessment information in their own time, and at their own pace. Screencasts can also be revisited, which may help students to ensure they have met the marking criteria for a specific assessment. Furthermore, embedded videos/links to information to support the development of key writing skills (e.g. critical analysis skills) within these screencasts expand upon the information my cohort received, and will help students to develop these skills at the onset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, we now have a further 4 student-staff projects that are currently being completed and we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from student’s experiences of receiving course information and their views for course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage in their course at a higher level, allowing them to have a role in shaping the course around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.

Using personal capture to supplement lectures and address FAQs

Will Hughes – School of Built Environment (Construction Management & Engineering)


Overview

The personal capture pilot project helped me to develop and test ideas that advanced what I had previously been trying using YouTube. One important lesson for me was that shorter videos engage students better. I also learned how to record videos featuring more than simply a talking head. Using this technology to augment the usual pedagogic techniques was very useful. I would like to replace some of my lecturing with screen-cast videos, but I have learned that there is more to this than simply recording pre-prepared lectures for my students.

Objectives

My aim was to produce detailed explanations of points too elementary or too complex to address in lectures and to replace some one-to-one meetings. I aspired to produce a series of 5-10 minute videos that responded to specific student questions generated from lectures and emails. One specific idea was to support reflective portfolio writing.

Context

My motivation to join the personal capture project was to acquire screen-casting skills and to better understand the technology.

There were two key groups I chose to produce recordings for:

  • 40 MSc students, of whom some were flexible-modular and off-campus except when there were formal classes. The main module was CEM102: Business of Construction.
  • 142 BSc students on a Part 2 module: CE2CPT Construction Procurement.

Implementation

I tried using the webcam and laptop provided in the pilot. With these, I made some videos using the Mediasite tool, but the video and audio quality were not as high as I would have liked, and the editing offered by Mediasite was very primitive, with no opportunity to fix issues such as colour grading. I preferred using my own professional-grade camera, microphone and lighting. I realised that I needed much better software than Mediasite and bought a licence for Camtasia, which opened up a lot of interesting possibilities and made it possible to achieve what I had in mind.

Dialogue with students was around presenting them with a video and asking them to let me know what they thought, whether it helped and what kind of things they would like me to cover in future.

Impact

The most well-received videos were those that summarised assignment guidance in 10-11 minutes. My video on research conceptualization proved popular. The assignment summaries in CEM102 Business of Construction, for a Reflective Portfolio and a Case Study, were very impactful and prompted a lot of student approval.

One unanticipated experience was using the technology to replace a lecture cancelled due to bad weather: 66% of the students accessed this 55-minute lecture, but for an average view time of only 18 minutes, which I found to be a depressing statistic.

Reflections

Things improved as I progressed. Planned use of personal capture was much better than using it to overcome lecture cancellations. The pedagogical challenge is to figure out how to produce short videos that are useful to students. It was useful to work out how to provide simple overviews of things that would be helpful in the students’ learning and produce short videos based on this. I found filming at home better than filming in the office. I have learned the importance of issuing reminders about Blackboard-posted videos as students can miss the initial announcement and then never see the video produced for them.

I found the Mediasite tool itself clunky and challenging in terms of its permissions, lack of utility and quality.

Follow up

I still believe personal capture is useful, but I am thinking about changing my strategies for how to use it. The changes are not technical but pedagogical. As I move to part-time working and have less contact time with students, personal capture may become indispensable for me.

‘A-level Study Boost: Unseen Poetry and the Creative Process’: an online course

Rebecca Bullard, School of Literature and Languages, r.bullard@reading.ac.uk

Overview

‘A-level Study Boost: Unseen Poetry and the Creative Process’ is a two-week online course created by staff and students in the Department of English Literature and the Online Courses team, and hosted on the social learning platform, FutureLearn. It engages a global audience of learners in reading, writing, discussing, and enjoying poetry.

Objectives

The analysis of poetry, sometimes called ‘close reading’ or ‘practical criticism’, is a core skill for the study of English Literature. This course aims to develop this skill in pre- and post-A-level students of English Literature in ways that supplement teaching in schools and FE colleges. In doing so, it encourages students to make a successful transition from A-level to university-level study of English and Creative Writing.

Context

The Online Courses team at UoR approached colleagues in the Department of English Literature to work with them to develop a course that would connect students’ pre-university learning with their studies at UoR. The resulting online course develops learners’ subject-specific skills and gives them insight into what studying English and Creative Writing at university level might be like.

Implementation

Staff in the Online Courses team and the Department of English Literature worked together to combine their diverse areas of expertise. Yen Tu, Digital Learning Producer, supported by Sarah Fleming, Assistant Digital Learning Producer, ensured that the course reflects best practice in the pedagogy of online social learning (Sharples 2018; Laurillard 2014). Rebecca Bullard, as subject specialist, wrote the articles and designed tasks and activities to develop learners’ creative and critical skills.

It took about six months of intensive collaboration to produce the course materials. The first live run of the course took place over two weeks in December 2019. Rebecca and a team of student mentors engaged with learners on the FutureLearn platform throughout the live run to facilitate social learning and encourage completion of the course. The course content, feedback and statistics are currently being evaluated in order to measure impact and inform the next run.

Impact

The impact of the initial run of this course can be evaluated using the UoR Evaluation and Impact Framework (L1: Reach, L2: Reaction, L3: Learning, L4: Behaviour), using course analytics and comments from learners. Some participants gave permission for us to use their comments; where permission was not explicitly given, comments have been paraphrased:

L1: c. 1970 learners from over 100 countries enrolled on the first live run of this course. Comments on completing the course included the following:

L2: “I have always loved poetry but found some modern poems inaccessible. This course [has] shown me some ways to gain access.”

L3/4: “I’m a school teacher, having to teach unseen texts next year. This course has made me enjoy reading and dissecting poetry and I hope that I’ll succeed in inspiring my students to do the same.”

L3/4: One learner commented that the course has changed her perspective on poetry and that she is considering applying to UoR as a result of this course.

Reflections

The success of the course emerged out of the different kinds of collaboration that it involved and encouraged:

Staff-student: The course highlighted the expertise of UoR staff and students. The course videos showcase real teaching methods used in the Department of English Literature, and offer tangible evidence of the academic excellence and outstanding learning experience that underpin the UoR T&L Strategy 2018-21. Current students were paid to work as mentors on the course, giving them confidence in their own expertise.

English Literature-Creative Writing: The course engages learners in both critical analysis and creative practice, reflecting research that indicates the close relationship between these different methods of approaching literary studies (Lockney and Proudfoot 2013).

Department of English Literature-Online Courses: Specialists in both areas drew on their different kinds of expertise to develop a structure, set of activities, tone and style for the course that encourage maximum engagement from learners.

Learner-Educator-Mentor: The social learning platform FutureLearn facilitates active, real-time conversations between Learners, Educators and Mentors, which strengthens and deepens their engagement with the course material.

Follow up

During 2020, further research will be undertaken to evaluate the impact of the course on particular learner groups. The Online Courses team will run a research study to evaluate how teachers (including those in WP areas) are using the course in their teaching. The Department of English Literature will evaluate the impact of the course on students enrolled on EN1PE: Poetry in English.

‘Unseen Poetry’ will be an exemplar for a new ‘A-Level Study Boost’ series which will be rolled out to other Schools across UoR.

Links

‘A-level Study Boost: Unseen Poetry and the Creative Process’: https://www.futurelearn.com/courses/a-level-study-unseen-poetry

References

Laurillard, Diana. 2014. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. Abingdon: Routledge.

Lockney, K. & K. Proudfoot. 2013. ‘Writing the unseen poem: Can the writing of poetry help to support pupils’ engagement in the reading of poetry?’ English in Education 47:2, 147-162.

Sharples, M. 2018. The Pedagogy of FutureLearn: How our learners learn. https://about.futurelearn.com/research-insights/pedagogy-futurelearn-learners-learn

Using personal capture as a method of coaching

Ed Collins – School of Agriculture, Policy & Development


Overview

Personal capture software was used as a method of coaching: facilitating good study practice and identifying milestones to help students develop excellent assignments across two undergraduate modules. The impact of the project was two-fold. From a student’s perspective, it enabled students to prepare independently for the various assessments, allowing them to re-listen to the advice given. From a lecturer’s perspective, it decreased the amount of face-to-face contact hours whilst maintaining high standards of tutelage.

Objectives

The objective of the project was to offer an enhanced learning experience by adopting online personal capture tools to produce video resources. The focus was on helping the students prepare for the assessments that were associated with the module.

Context

The project involved two undergraduate modules: a second-year Marketing Management module with 120 students and a final-year Business Strategy module with 70 students. The modules lent themselves to alternative knowledge delivery due to the size of the cohorts and the types of assessment. The screen-casts allowed students to prepare each element in an organised way, but also allowed them a certain degree of flexibility, as they reduced the number of face-to-face tutorials.

Implementation

From the start, the students were involved in the process. Student reps were selected and were consulted after every recording. This helped to get traction from a student point of view but also to get a sense of the reception the recordings were getting. Making the students aware of the recordings was imperative and a follow-up email when they were released was sent out. The recordings were of me delivering to camera (without use of any slides). I chose this format as I felt the students would focus more on what I said and make their own notes, rather than depending on slides.

Impact

The objectives of the project were achieved. From a student’s perspective, they could download and listen to the videos at will, and did so repeatedly. The recordings guided students through the content delivered in the lecture, but also through the development of the assessments; signposting readings and suggesting best practice for structuring the assessments formed the main thrust of the recordings. An unexpected outcome was the reduction in face-to-face time I had with my students. There was less demand on my office hours, which is both good and bad, as I feel it is important to encourage students to talk to their module leaders outside of class.

Reflections

In my experience, Mediasite (the personal capture software used for the pilot project) did not work as smoothly as I had hoped. As a result, I adopted the software that my Dell computer recording studio offered. I am concerned that students may have unrealistic expectations about the quality of captured recordings. I feel that students are now used to high quality vlogging on YouTube and other platforms and may have an expectation that all videos produced as learning resources in their university experience need to be highly professional.

Follow up

I plan to expand my use of personal capture in my practice to include marking scripts and giving students feedback, as well as preparing students for assignments. I will test this with postgraduate classes in the forthcoming academic year. I will also be mentoring other staff members to use recordings as much as possible in their courses, in the same way as during this project.

Use of personal capture to enhance the module selection process in Mathematics and Statistics

Calvin James Smith – Department of Maths & Statistics


Overview

We created short videos advertising the content of modules to enable students to make more informed choices during the module selection process. Staff reported mixed experiences and interest in Mediasite personal capture so other mechanisms were also used (e.g. Camtasia, use of camcorder). Student feedback was positive and did not single out a preferred model of video recording.

Objectives

To create a library of short videos promoting module content to support the module selection process. The library is to be made available via a Blackboard Organisation. Videos should be:

  • Short and concise
  • Reusable
  • Focused on the main content of the module (not elements whose emphasis depends on the staff delivering it)

Context

Student feedback had revealed that students felt there was a lack of guidance and support around module selection, with some reporting that they only discovered a module wasn’t for them after it was too late to change. Historically, we had provided module selection advice via the tutor system and carousel-style talks after the exam periods or in Week 6; however, these mechanisms have seen declining levels of student participation and efficacy in recent years, so a new approach was trialled using the Personal Capture pilot.

We made videos for a wide range of Part 2 and final year modules and made these available in Blackboard alongside “pathway diagrams” showing the pre-requisites linking modules.

Implementation

Maths and Statistics has a mixed relationship with screen-casts (typically linked to difficulties in capturing mathematical notation), so it was necessary to develop several options for producing videos, enabling colleagues to select the mechanism which worked best for them. Working with a colleague, Hannah Fairbanks, I put together two sample videos for MA2MPH (produced using Mediasite) and ST2LM (using a camcorder), and shared these with colleagues alongside an offer of support to produce their own content. No pressure or steer towards one mechanism (Mediasite or camcorder) was given; rather, we prioritised ease of producing AV content in a way colleagues felt comfortable with. Typically, Hannah or I would arrange a time to meet colleagues and support them one-to-one. In addition, some staff used the Camtasia tool.

We spoke with students continuously throughout the process to receive feedback on what was useful content, both informally and using a feedback survey.

Impact

We created module selection videos for 6 (of 15) Part 2 modules and 10 (of 24) final year modules. These recordings were made available in a Blackboard Organisation called Maths Module Selection, alongside pathway information showing how the modules fit together both within and between years, and alongside conventional resources such as the module catalogue and programme.

Staff involvement with the Personal Capture pilot did appear to promote additional discussions about inclusive practices and accessibility of resources.

Student feedback was broadly positive, indicating that this was a suitable solution to the challenge of supporting their selection of optional modules.

Reflections

I was particularly pleased to be able to provide inclusive module selection support at times that suited students, rather than being conditional on staff availability. However, I was unable to convince all colleagues delivering optional modules of the merits of producing these videos, so our coverage is not complete; student feedback identified this deficit and asked for the remaining videos to be produced. Undeniably, some staff were put off by the additional burden of producing transcripts to meet our accessibility obligations (although we have had some success using Google Docs to ease the production of these).

Staff who had already developed slides for module delivery were typically more willing to engage with the process (talking over these slides), but otherwise it was challenging to solicit involvement, with broad reluctance to appear as a ‘talking head’ or be filmed at the board.

We won’t know if this has been a successful means for supporting module choice until we see a reduction in ‘module tourism’ in the 2019-20 cycle.

Follow up

I’m hoping that, now a bank of videos is available, we can “fill in the gaps” on a more leisurely timescale, enabling colleagues to contribute without the time pressures of the pilot project.