Using student feedback to make university-directed learning on placement more engaging

Anjali Mehta Chandar: a.m.chandar@reading.ac.uk

Charlie Waller Institute, School of Psychology and Clinical Language Sciences

 

Overview

Our vocational postgraduate courses in Cognitive Behavioural Therapy include University Directed Learning (UDL) days that are completed within the placement setting (e.g. the student's NHS trust). A qualitative student feedback survey allowed us to adapt this format collaboratively, with favourable outcomes in how interesting, enjoyable and useful students found the day.

Objectives

Our objectives were as follows:

-To ascertain how interesting, enjoyable and useful the UDL days were, as perceived by the students, based on pedagogical findings that students engage best and are most satisfied when these characteristics are met (e.g. Ramsden, 2003).

-To make improvements to the UDL days based on qualitative student feedback.

-To ascertain whether our improvements had made the UDL days more interesting, enjoyable and useful, as perceived by the next cohort of students.

Context

The Educational Mental Health Practitioner (EMHP) and Children’s Wellbeing Practitioner (CWP) programmes are one-year vocational postgraduate courses. The students are employed by an NHS trust, local authority or charity, and study at UoR to become qualified mental health practitioners.

UDL days make up a small proportion of the teaching days. They are self-guided teaching days, usually containing elements of e-learning, designed to complement and consolidate face-to-face teaching (live or remote). A combination of learning methods, including e-learning, has been shown to be effective in increasing clinical skills (e.g. Sheikhaboumasoudi et al., 2018).

UDL days had been poorly received by our two 2019/2020 cohorts, according to feedback in the student rep meetings and Mentimeter feedback after each UDL; comments included: ‘there was too much [content] for one day’, ‘I felt pressured to fill [the form] out rather than focussing on the readings themselves’ and ‘[the reflective form] was too long and too detailed’. Whilst this gave us some ideas on changes to make, I was aware of the low completion rates of the Mentimeter feedback. Therefore, to hear from more voices, we decided to create a specific feedback survey about the UDLs to help us make amendments in a collaborative way.

Implementation

We started by creating a survey for the current students to ascertain their views on how interesting, enjoyable and useful the UDL days were. We also asked qualitative questions about what they liked and disliked, and their ideas for specific improvements.

I then led a meeting with my course team to explore the key findings. We agreed to make several changes based on the specific feedback, such as:

– a greater variety of activities (not purely e-learning, but roleplays, videos, self-practice/self-reflection tasks, group seminars run by lecturers, etc.), to provide a more engaging day
– fewer activities (we aimed for one main activity per 1-1.5 hours, to manage workload demands)
– an option to complete the programme’s reflective form (simplified by asking students to provide their own notes on each task) or to provide their notes in a format of their choice (e.g. mindmaps, vlogs, etc.), to increase accessibility
– an invitation to share these reflections on a discussion board for other students and the lecturer to comment on.

We were unable to implement these changes for the current cohort, as they had already completed all the UDL days in their timetable, so we made the changes for the following cohorts in 2020/2021.

We then sought feedback from these cohorts via a new survey to ascertain their views on how interesting, enjoyable and useful the UDLs were, with additional questions seeking specific feedback on the new elements.

Impact

The survey results for the newer cohorts, collected after the changes were made to the UDL format, were much more positive than those for the original cohort.

There was a significant increase in how interesting, enjoyable and useful the students found the days.

The trainees also largely agreed that the UDLs had an appropriate workload, e.g. one task per 1-1.5 hours.

They also largely agreed that UDLs included interactive and varied tasks. This finding contrasts somewhat with the aforementioned literature on the importance of e-learning: it is a reminder that too much e-learning can be less engaging for trainees.

The students also praised the simple reflective form as a helpful tool, and many appreciated the option to submit notes in their own preferred way.

Although we neglected to explore the role of lecturer feedback in the new UDL survey, research shows that lecturer interaction makes for a more engaging e-learning experience (Dixson, 2010), which may also explain why the UDLs were now viewed more favourably.

Moreover, the process of collecting data from the students via a feedback form seemed effective, in that we used the feedback to adapt this specific teaching method and thereby improve student satisfaction. Pedagogical research shows the importance of using qualitative questions (instead of, or as well as, quantitative methods) to elicit student feedback (Steyn et al., 2019).

Reflection

Overall, this redesign was successful, which may be because we used the student voice to make meaningful changes. This is in line with Flodén’s (2017) research showing that student feedback can help to improve courses.

Furthermore, the changes we have made are in line with effective practice on other courses and at other universities, e.g. appropriate workload (Ginns et al., 2007), student choice of discussion format (Lin & Overbaugh, 2007), accessibility of resources (Mahmood et al., 2012) and lecturer interaction (Dixson, 2010).

There is a possible limitation in this case study, in that our more recent cohorts are generally happier on the course and may therefore be more positive about the UDLs. In future projects, it would be useful to notice themes within module evaluations and student rep meetings earlier, so that we can elicit specific survey feedback earlier in the course and make amendments sooner, allowing feedback from the same cohort.

In future iterations of the survey, I would also wish to ask explicitly how trainees find sharing reflections on the Blackboard discussion boards, as this is one change we had not elicited feedback on.

Follow Ups

We have continued to use these changes to the UDL format with subsequent cohorts, e.g. reduced workload, a variety of activities, simplified forms, choice of discussion format and lecturer interaction. We have not received concerns about these days in the student rep meetings since the original cohort. The Mentimeter feedback at the end of each UDL is generally positive, with one person recently commenting: ‘this was a very engaging day’.

References


Dixson, M. D. (2010). Creating effective student engagement in online courses: What do students find engaging? Journal of the Scholarship of Teaching and Learning, 1-13.

Flodén, J. (2017). The impact of student feedback on teaching in higher education. Assessment & Evaluation in Higher Education, 42(7), 1054-1068.

Ginns, P., Prosser, M., & Barrie, S. (2007). Students’ perceptions of teaching quality in higher education: The perspective of currently enrolled students. Studies in Higher Education, 32(5), 603-615.

Lin, S. Y., & Overbaugh, R. C. (2007). The effect of student choice of online discussion format on tiered achievement and student satisfaction. Journal of Research on Technology in Education, 39(4), 399-415.

Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A comparative study of student satisfaction level in distance learning and live classroom at higher education level. Turkish Online Journal of Distance Education, 13(1), 128-136.

Ramsden, P. (2003). Learning to teach in higher education. Routledge.

Sheikhaboumasoudi, R., Bagheri, M., Hosseini, S. A., Ashouri, E., & Elahi, N. (2018). Improving nursing students’ learning outcomes in fundamentals of nursing course through combination of traditional and e-learning methods. Iranian Journal of Nursing and Midwifery Research, 23(3), 217.

Steyn, C., Davies, C., & Sambo, A. (2019). Eliciting student feedback for course development: the application of a qualitative course evaluation tool among business research students. Assessment & Evaluation in Higher Education, 44(1), 11-24.

Links

CWI website: https://sites.reading.ac.uk/charlie-waller-institute/

The One Where a Timetable Merger Gives Rise to a Curriculum Implementation Review

Emma-Jayne Conway, James Kachellek and Tamara Wiehe

t.wiehe@reading.ac.uk


Overview

Staff and students in CWI collaborated on a project initially designed to merge the timetables of two sister programmes to aid cross-programme working (objective 1), which also gave rise to the perfect opportunity to review the way our PWP curriculum is implemented following the pandemic (objective 2). In this blog, we reflect on achieving both objectives within our original timeframe!

Objectives

1.     To create a single timetable to aid cross-programme working for teaching and administrative staff.

2.     To review curriculum implementation including structure and modality on a modular and programme level with all key stakeholders.

Context

In response to a departmental restructure, we required more efficient ways of working across programmes starting with a uniform timetable. Early on, the project evolved to also review the structure and modality of the curriculum. Our two sister PWP training programmes (one undergraduate and one postgraduate) are virtually identical with a few exceptions but historically had been managed separately.

Over the course of 2021, we planned, designed, and implemented a timetable merger for our September cohorts. This impacted on 3 modules (4 for undergraduates) that form the PWP training year for the MSci Applied Psychology (Clinical) students and the Postgraduate/graduate Certificate in Evidence-Based Psychological Treatments (IAPT Pathway).

Taking both Higher Education and Mental Health Care processes into consideration was no easy feat: these included processes specific to the University of Reading (e.g. term dates), our national PWP curriculum, which specifies the content and learning outcomes for our 26 teaching days and 19 study days, and British Psychological Society (BPS) accreditation requirements. Modality was a particularly important topic throughout the project, as we took key learnings from remote delivery during the pandemic while awaiting guidance from our professional accrediting body.

Overall, it served as an excellent opportunity to work collaboratively with staff and students to review the implementation of PWP training at the University of Reading.

Implementation

  1. Early 2021: The PWP team met on several occasions to discuss the possibility of merging the two timetables, including transitioning to a “blended” format of online and face-to-face teaching post-Covid. We set out a timeline for planning, designing, and implementing the project.
  2. Advice was sought from the Director of Training in CWI and colleagues in Academic Development, CQSD based on their experience of timetable mergers and a green light was given based on our draft plans!
  3. Several options were considered before the final format was arrived at: face-to-face teaching is weighted towards the first module/term, with a progressive increase in online taught content as the course progresses. (The rationale is supplied elsewhere in this blog.)
  4. The educator team were able to draw on feedback from online teaching to gauge the attitude of the student body to online learning, as well as expectations and concerns related to a return to the classroom (see Impact, below). The student voice was important in terms of utilising partnership to create meaningful change to curriculum implementation. However, engaging professional practice students via the course reps was a challenge due to time constraints; we were therefore able to engage graduates instead. This is something we would consider earlier on in future projects.
  5. The educator team unanimously agreed that the externally taught content of the VEC module could be effectively taught with mixed cohorts from the Core-PWP and MSci cohorts using an online approach.
  6. Information on the changes was disseminated to Program Administrators to enable efficient implementation. External Educators were made aware of the retention of online lecture sessions, and the mixed-cohort approach, by the VEC module convenor.
  7. Timetables were updated by the Programme Director, in collaboration with Module Convenors; consideration was given to the potential workload impact of closely aligning multiple cohorts (see below). Timetables were reviewed by the team ‘side-by-side’ to ensure that workload balance is maintained for educators across all cohorts. We continue to monitor the impact on workload while adjustments are made to teaching (such as with the Working Document mentioned in the Follow-up section, below).
  8. IAPT Services were made aware of the changes to the timetables.

Impact

As of October 2021, the merged timetables are proving effective, with no major barriers having been detected. Predicted barriers included barriers to effective teaching of (previously face-to-face) content, student/staff dissatisfaction with a blended approach, and significant administrative/technical difficulties.

Face-to-face teaching resumed in September 2021 and the return to the classroom has been successful. Educators report being able to switch between live online sessions and face-to-face teaching days without significant barriers.

The educator team plan to continue to gather feedback on the student experience of the blended and merged approach. We will be able to assess feedback when the first cohorts fully complete in June 2022.

Feedback will be sought from module convenors, educators, and program administrators using “menti” feedback forms, bi-weekly team meetings and informal qualitative discussion, to gauge the impact of the changes on workflow. Student feedback will also be monitored through end-of-module feedback collated by the School.

Reflection

  • We reflected on the challenge of engaging professional practice students and on utilising graduates to overcome this. We will consider setting up a graduate/patient advisory group for future projects.
  • Using feedback from an MSci graduate led to timetable changes to ensure readability and clarity for students. This included points such as colour-coding face-to-face vs online teaching days, explaining acronyms, etc.
  • Involving all members of the team (especially Module Convenors) felt like a much more meaningful and collaborative process than Programme Director decisions alone. It gave Module Convenors autonomy over their modules as well as aligning learning outcomes across the 3 modules of the programme which is particularly important for clinical training. Other courses may wish to replicate this approach to build team cohesion and allow all colleagues to make meaningful contributions to programme changes and delivery.

Follow up

  • A working document has been created for the educator team to comment on the teaching they have just delivered, e.g. was there enough time to deliver all the content? This has allowed changes to be made within a matter of weeks, as the same day is delivered across the programmes. As a result, we can fine-tune the timetable and delivery of the programme quickly and efficiently to improve the student experience.
  • We will review module by module and at the end of each cohort to continue making any necessary adjustments. Module and programme evaluations, student-staff rep meetings and any feedback from individual teaching days will also help to inform this.

 

Driving programme development in the IOE: student focus groups and paper writing

Jo Anna Reed Johnson – Institute of Education

j.a.reedjohnson@reading.ac.uk


Overview

This article outlines the thinking behind driving programme development through student focus groups across three IOE programmes. The goal of writing a paper and presenting at a conference helped me to frame this project with a team of academics, focusing on changes made during Covid-19 (2020-2021). The article shares reflections on setting up and running the focus groups, delivering the conference presentation and writing the final paper. Finally, it discusses what we have learnt from this and what we will continue to do.

Objectives

  • Share four academic perspectives on the redesign, due to Covid-19, of three modules (SKE, STEAM, PGCE Sec Science) that all have practical elements (laboratory or school), by sharing what we did and exploring the student perspective
  • Show how we designed opportunities for discussion and collaboration when conducting practical work or school related work online
  • Consider the use of student focus groups for programme co-development
  • Reflect on the collaborative nature of paper writing and co-programme reflections

Context

At the IOE there is a range of teacher education programmes with a practical focus. The four colleagues engaged in this article were involved with Skills in Schools (ED2TS1 – March to July 2020), SKE (IESKEP and PFTZSKEMATHSA – March to Aug 2020) and PGCE Secondary Science (GFSTSCIENCE – September 2020 to June 2021). These programmes all require students to work in schools and engage in a science laboratory (if science focused). When COVID hit in March 2020, we had to think quickly and imaginatively, transforming our provision to online delivery where required. Having worked across all three programmes, I felt it was pedagogically appropriate to engage our students in the same ways we had engaged them throughout their learning during the pandemic, when they worked in online communities of practice to reflect. Thus, we decided to set up a series of focus groups in which students reflected on the impact of the changes and provided insights for future programme innovations. This culminated in a conference presentation and paper.

Implementation

The focus was to drive programme development through the reflections and shared experiences of academics and students. I set up a project timeline and an MS Team to manage and drive the deliverables, with the end goal of engaging students as co-programme developers and culminating in a conference presentation and paper. This required framing the project, seeking ethical approval and funding, setting up focus groups to collect data, and then reflecting and writing up.

Framing the project allowed me to maintain the focus on the redesign, due to Covid-19, of three modules that all had practical elements (laboratory or school), and then on exploring how that had impacted students through focus groups. It was the conference and paper deadlines that drove this activity and timeline. At first colleagues wondered why we were writing a paper for a call related to the School of Architecture (Manchester and Manchester Metropolitan University), but in fact it was because the call was about ‘place’. The remit was a paper related to ‘online education: teaching in a time of change’.

Seeking ethical approval and funding required knowing where to go and what to do. Ethical approval required submission of an ethical approval form (including consent form, interview schedule and focus group details) to the IOE ethics committee. Funding was then applied for through the University Travel Grants Scheme (contact: Tasha Easton, e.saxon@reading.ac.uk).

Data collection began with an initial feedback request via MS Forms. Consent was also required, so where this could not be obtained in person, a consent statement was attached to the top of the MS Form. Once participants had consented and those who were willing had indicated interest in taking part, I set up a series of focus groups across the three programmes, to take place on MS Teams. We decided to split the four sets of interviews into subject-specific groups so that the conversations and reflections could be driven by the students. One student was nominated as the chair, and they had a template of questions to guide their discussions.

Paper writing was a challenge, as we needed to fit it around our teaching-focused roles. I created a writing template after attending an IOE Research and Scholarship Writing Workshop with Professor Alan Floyd, and scheduled meetings to review, discuss and allocate sections of writing.

The whole process began in December 2020 and continued through to 30 May 2021, with the conference on 21-23 April 2021 and paper publication in July 2021.

 

Impact

There were several elements of impact:

  • Working collaboratively with colleagues to reflect on programme development
  • Engaging students as co-programme developers
  • Attending a conference (where funding allowed)
  • Conference paper presentation
  • Conference paper publication

Reflection

In terms of setting up the focus groups and driving data collection, we learnt that we needed to be organised, and the timeline/plan really helped to keep that focus. There were times when we were too busy, but we had to create time as we had deliverables to meet. If we had not had those deliverables of a conference presentation and paper, we might have let this slip and done it ‘next year’.

Writing the paper was a challenge in that we had not done this together before, and some colleagues had not written an academic paper, or even an educational one, in a very long time. So, creating the writing template and allocating tasks worked.

Gaining conference funding can always be a challenge, but reaching out and asking was the first thing to do, followed by finding out what could be offered at University/School level. Next time, we would all like to attend the conference. Being an online conference made it more difficult to engage, and I think next time we would plan for all of us to get funding and attend a face-to-face conference, so that we too can benefit from being part of the community of practice.

What we will continue to do….

  • Develop students as programme co-developers through focus groups, engaging them in the paper writing.
  • Use focus groups to help us (academics) reflect on our own practice and discuss developments across programmes.
  • Drive programme development through the sharing of practices, building communities of practice with timelines and deliverables.

What else will we do…

  • Engage students in the paper writing and conference process.
  • Seek funding to attend a F2F conference with colleagues, to allow us time and space to continue to reflect on practice.

Links

Research and Travel Grants Committee: https://www.reading.ac.uk/closed/research/committees/res-researchtravelgrantsubcommittee.aspx

AMPS Conference 21-23 April 2021 – https://architecturemps.com/online-ed-conference/

Merging the Academic Tutor System into Compulsory Core Skills Modules

Lizzy Lander – School of Chemistry Food and Pharmacy

e.r.lander@reading.ac.uk


Overview

This blog outlines the successful integration of core (compulsory) skills modules with the academic tutor system, via a curriculum of tutorials designed in this project to be delivered by tutors. The project involved designing and scheduling tutorials, delivered by tutors, that supported content in the core skills modules (e.g. writing and referencing), allowing better support for student academic skill development and linking tutors more closely to the taught material.

Objectives

  • Link the new academic tutor system with the existing Key Skills modules through newly designed, tutor-delivered academic tutorials discussing core skills.
  • Design this curriculum of tutorials to improve engagement and development of skills at relevant points in the academic year.
  • Design tutorial resources for tutors to ensure consistent support for tutees.
  • Schedule tutorials so tutors and tutees have a place/time to meet in their timetable to facilitate engagement.

Context

SBS had been delivering core skills “Key Skills” modules in Parts 1 & 2 for a number of years (since 2015), focused on academic skills (e.g. writing, referencing). The introduction of the academic tutor system (2018), with its greater focus on academic skills development, was closely aligned with the learning outcomes of these modules; it was therefore proposed to link the two together.

Implementation

Firstly, an audit of the core skills taught in the Key Skills modules took place, to identify which would be most impactful for student development if reinforced through integration into tutorials with academic tutors. Then assignment timetabling was examined to create a schedule of tutorials for the identified skills that allowed practice and formative feedback before assignments, as well as post-assignment feedback to help students identify areas for development.

Next, tutorials were formally timetabled, so students viewed these sessions as part of their “normal” academic schedule rather than optional meetings with their tutors.

Afterwards, resources were created so tutors could facilitate these tutorials. These consisted of a one- to two-page pro-forma to inform tutors about the running of each session, plus materials for activities, such as essays to critique as a group. This also helped improve consistency between tutors delivering these sessions, as all tutors would have the same session to deliver.

Finally, this project was presented to staff along with details on how resources and information would be disseminated (initially email). Throughout the year tutorials were run by academic tutors directed by the pro-formas and resources with support if needed. Tutors also marked their tutees’ assignments in Key Skills and gave them feedback in their tutorials. Outside of the scheduled tutorials tutors gave one-to-one support for tutees as needed.

Impact

These structured and timetabled tutorials were highly effective in supporting tutees to improve their academic skills and in improving the consistency with which tutors engaged with their tutees. The positive impact was clear from the students (surveying Parts 1 and 2 in 2019): 53% felt supported/very supported by their tutor; 65% were satisfied/very satisfied with their sessions; 62% found the summative feedback from tutors helpful; and 72% found it useful/very useful to have tutorials in their timetable. Tutors also fed back that they had a much clearer idea of what to do in tutorials and how best to support student development, and that they valued the resources provided in this project.

Overall staff and student experience was positively impacted with staff being led and guided to successfully support student development more effectively and consistently.

Reflection

This activity was successful in the way in which it blended together academic tutors, compulsory modules, and assessment and feedback. This generated a platform from which students could learn and practise academic skills for success at university, both in a compulsory module and with their tutor, through formative and summative feedback. It also helped formalise the role of the tutor for both staff and students, giving both groups direction, which ultimately benefitted the students’ academic development. Given that each tutorial had a pro-forma of discussions and activities, all tutorials stayed consistent, so all students got broadly the same development opportunities. Finally, the timetabling of meetings made the tutorials feel like a normal part of the academic calendar, encouraging engagement.

Supplying information and resources to academics via email proved not to be the most efficient distribution method, and ultimately some students did not attend tutorials despite reminders of the purpose of these sessions, meaning not all students benefitted from this project.

Follow up

The core outcome of the project, namely that Key Skills modules would be linked to academic tutorials run by tutors, with assessments marked by tutors, has continued to be implemented in SBS. However, some alterations have been made to allow more efficient access to materials, by placing resources on OneDrive where they can be accessed at any time. This has since evolved into an MS Team to store these resources and also allow tutors to ask questions.


Xerte: engaging asynchronous learning tasks with automated feedback

Jonathan Smith: j.p.smith@reading.ac.uk

International Study and Language Institute (ISLI)

Overview

This article describes a response to the need to convert paper-based learning materials, designed for use on our predominantly face-to-face summer Pre-sessional programme, to an online format, so that students could work independently and receive automated feedback on tasks, where that is appropriate. A rationale for our approach is given, followed by a discussion of the challenges we faced, the solutions we found and reflections on what we learned from the process.

Objectives

The objectives of the project were broadly to:

  • rethink ways in which learning content and tasks could be presented to students in online learning formats
  • convert paper-based learning materials intended for 80 – 120 hours of learning to online formats
  • make the new online content available to students through Blackboard and monitor usage
  • elicit feedback from students and teaching staff on the impacts of the online content on learning.

It must be emphasized that due to the need to develop a fully online course in approximately 8 weeks, we focused mainly on the first 3 of these objectives.

Context

The move from a predominantly face-to-face summer Pre-sessional programme, with 20 hours/week contact time and some blended-learning elements, to fully-online provision in Summer 2020 presented both threats and opportunities to ISLI.  We realised very early on that it would not be prudent to attempt 20 hours/week of live online teaching and learning, particularly since most of that teaching would be provided by sessional staff, working from home, with some working from outside the UK, where it would be difficult to provide IT support. In addition, almost all students would be working from outside the UK, and we knew there would be connectivity issues that would impact on the effectiveness of live online sessions.  In the end, there were 4 – 5 hours/week of live online classes, which meant that a lot of the core course content had to be covered asynchronously, with students working independently.

We had been working with Xerte, an open-source tool for authoring online learning materials, for about 3 years, creating independent study materials for consolidation and extension of learning based around print materials. This was an opportunity to put engaging, interactive online learning materials at the heart of the programme. Here are some of the reasons why we chose Xerte:

  • It allows for inputs (text, audio, video, images), interactive tasks and feedback to be co-located on the same webpage.
  • There is a very wide range of interactive task types, including drag-and-drop ordering, categorising and matching tasks, and “hotspot” tasks in which clicking on part of a text or image produces customisable responses.
  • It offers considerable flexibility in planning navigation through learning materials, and in the ways feedback can be presented to learners.
  • Learning materials could be created by academic staff without the need for much training or support.

Xerte was only one of the tools for asynchronous learning that we used on the programme.  We also used stand-alone videos, Discussion Board tasks in Blackboard, asynchronous speaking skills tasks in Flipgrid, and written tasks submitted for formative or summative feedback through Turnitin.  We also included a relatively small number of tasks presented as Word docs or PDFs, with a self-check answer key.

Implementation

We knew that we only had time to convert the paper-based learning materials into an online format, rather than start with a blank canvas, but it very quickly became clear that the highly interactive classroom methodology underlying the paper-based materials would be difficult to translate into a fully-online format with greater emphasis on asynchronous learning and automated feedback.  As much as possible we took a flipped learning approach to maximise efficient use of time in live lessons, but it meant that a lot of content that would normally have been covered in a live lesson had to be repackaged for asynchronous learning.

In April 2020, when we started to plan the fully-online programme, we had a limited number of staff who were able to author in Xerte. Fortunately, we had developed a self-access training resource, which meant that new authors were able to learn how to author with minimal support from ISLI’s TEL staff. A number of sessional staff with experience in online teaching or materials development were redeployed from teaching on the summer programme to materials development. We provided them with a lot of support in the early stages: providing models and templates, storyboarding, and reviewing drafts together. We also produced a style guide so that we had consistent formatting conventions and presentation standards.

The Xerte lessons were accessed via links in Blackboard, and in the end-of-course evaluations we asked students and teaching staff a number of open and closed questions about their response to Xerte.

Impact

We were not in a position to assess the impact of the Xerte lessons on learning outcomes, as we were unable to differentiate between this and the impacts of other aspects of the programme (e.g. live lessons, teacher feedback on written work).  Students are assessed on the basis of a combination of coursework and formal examinations (discussed by Fiona Orel in other posts to the T&L Exchange), and overall grades at different levels of performance were broadly in line with those in previous years, when the online component of the programme was minimal.

In the end-of-course evaluation, students were asked “How useful did you find the Xerte lessons in helping you improve your skills and knowledge?” 245 students responded to this question: 137 (56%) answered “Very useful”, 105 (43%) “Quite useful” and 3 (1%) “Not useful”. The open questions provided a lot of useful information that we are taking into account in revising the programme for 2021. There were technical issues around playing video for some students, and bugs in some of the tasks; most of these issues were resolved after they were flagged up by students during the course. In other comments, students said that feedback needed to be improved for some tasks, that some of the Xerte lessons were too long, and that we needed to develop a way for students to quickly return to specific Xerte lessons for review later in the course.

Reflections

We learned a lot, very quickly, about instructional design for online learning.

Instructions for asynchronous online tasks need to be very explicit and unambiguous, because at the time students are using Xerte lessons they are not in a position to check their understanding either with peers or with tutors.  We produced a video and a Xerte lesson aimed at helping students understand how to work with Xerte lessons to exploit their maximum potential for learning.

The same applies to feedback. In addition, to have value, automated feedback generally (but not always) needs to be detailed, with explanations of why specific answers are correct or incorrect. We found, occasionally, that short videos embedded in the feedback were more effective than written feedback.

Theoretically, Xerte will track individual student use and performance, if uploaded as SCORM packages into Blackboard, with grades feeding into Grade Centre.  In practice, this only works well for a limited range of task types.  The most effective way to track engagement was to follow up on Xerte lessons with short Blackboard tests.  This is not an ideal solution, and we are looking at other tracking options (e.g. xAPI).
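For readers unfamiliar with xAPI, the sketch below shows the kind of record a single xAPI ‘statement’ contains. It is purely illustrative (the learner, lesson URL and score are invented, not drawn from our Xerte or Blackboard set-up), but it indicates the finer-grained activity data that makes xAPI an attractive tracking option.

```python
import json

# Minimal, hypothetical xAPI statement for one Xerte task.
# All names, addresses and IDs below are invented for illustration.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "https://example.ac.uk/xerte/presessional/lesson-03",
        "definition": {"name": {"en-GB": "Lesson 3: Paraphrasing skills"}},
    },
    "result": {"score": {"scaled": 0.8}, "completion": True},
}

# A learning record store would receive many such statements, allowing
# engagement to be analysed per learner, per lesson and per task.
print(json.dumps(statement, indent=2))
```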

Over the four years we have been working with Xerte, we have occasionally heard suggestions that Xerte is too complex for academics to learn to use. This emphatically was not our experience over Summer 2020. A number of new authors were able to develop pedagogically sound Xerte lessons, using a range of task types, to high presentation standards, with almost no one-to-one support from the ISLI TEL team. We estimate that, on average, new authors need to spend five hours learning how to use Xerte before they are able to develop materials at an efficient speed, with minimal support.

Another suggestion was that developing engaging interactive learning materials in Xerte is so time-consuming that it is not cost-effective.  It is time-consuming, but put in a situation in which we felt we had no alternative, we managed to achieve all we set out to achieve.  Covid and the need to develop a fully-online course under pressure of time really focused our minds.  The Xerte lessons will need reviewing, and some will definitely need revision, but we face summer 2021 in a far more resilient, sustainable position than at this time last year.  We learned that it makes sense to plan for a minimum 5-year shelf life for online learning materials, with regular review and updating.

Finally, converting the paper-based materials for online learning forced us to critically assess them in forensic detail, particularly in the ways students would work with those materials.   In the end we did create some new content, particularly in response to changes in the ways that students work online or use technological tools on degree programmes.

Follow up

We are now revising the Xerte lessons, on the basis of what we learned from authoring in Xerte and the feedback we received from colleagues and students. In particular, we are working on:

  • ways to better track student usage and performance
  • ways to better integrate learning in Xerte lessons with tasks in live lessons
  • improvements to feedback.

For further information, or if you would like to try out Xerte as an author and with students, please contact j.p.smith@reading.ac.uk, and we can set up a trial account for you on the ISLI installation. If you are already authoring with Xerte, you can also join the UoR Xerte community by asking to be added to the Xerte Users Team.

Links and References

The ISLI authoring website provides advice on instructional design with Xerte, user guides on a range of page types, and showcases a range of Xerte lessons.

The international Xerte community website provides Xerte downloads, news on updates and other developments, and a forum for discussion and advice.

Finally, authored in Xerte itself, this website provides the most comprehensive showcase of all the different page types available in Xerte, showing its potential functionality across a broad range of disciplines.

Learning to Interpret and Assess Complex and Incomplete Environmental Data

Andrew Wade a.j.wade@reading.ac.uk

Department of Geography and Environmental Sciences

Overview

Field work is well known to improve student confidence and enhance skills and knowledge, yet there is evidence of a decline in field work in Secondary Education, especially amongst A-level Geography students. This is problematic because students are entering Geography and Environmental Science degree programmes with reduced skills and confidence around field-based data collection and interpretation, and this appears to be leading to apprehension around data collection for dissertations. A simple field-based practical was developed in which 47 Part 2 Geography and Environmental Science students tested their own hypotheses about the factors that control water infiltration into soils. Improved confidence and appreciation of critical thinking around environmental data were reported in a survey of the student experience. Student coursework demonstrated that attainment was very good, and that skills and critical thinking can be recovered and enhanced with relatively simple, low-cost field-based practical classes that can be readily embedded to scaffold subsequent modules, including the dissertation.

Context

The importance of field work is well established in Geography and Environmental Science as a means of active and peer-to-peer learning. However, students appear to have little confidence in designing their own field work for hypothesis testing when they arrive for Part 1, probably due to a decline in field work in Secondary Education (Kinder 2016, Lambert and Reiss 2014). Within the Geography and Environmental Science programmes there is a Part 2, 20-credit ‘Research Training’ module that develops these skills. However, this research training module and the dissertation are seen by students as high risk, in that they perceive a low mark will have a significant negative impact on their overall degree classification. Consequently, students are seemingly risk averse around field-based projects. The idea here is to make field-based training more commonplace across multiple modules through the inclusion of relatively simple practical training, so that hypothesis testing, critical thinking and confidence with ‘messy’ environmental data become intuitive and students are at ease with these concepts. In parallel, GES module cohorts have increased in size in recent years, an additional reason to develop simple, low-cost practical classes.

Objectives

The aim of the project was to determine whether a simple, field-based practical would help boost student confidence around field data collection and interpretation, and hypothesis testing. The objective was to give the students a safe and supportive environment in which to develop their own hypotheses and method for field data collection, and to learn to interpret often ‘messy’ and ‘complex’ environmental data.

Figure 1: The practical class took place on the hill-slope on campus between the Atmospheric Observatory and Whiteknights Lake on 28 October 2019, over 4 hours in total.

 

Figure 2: Students used a Decagon Devices Mini-Disc Infiltrometer to measure unsaturated hydraulic conductivity, to test their own hypotheses about the factors controlling infiltration.

Implementation

A practical was designed in which 47 Part 2 students, working in groups of four or five, developed their own hypotheses about the factors controlling rainfall infiltration on a hill-slope in the classroom, following an in-class briefing, and then tested these hypotheses in the field using Mini-Disc infiltrometers (Figs. 1, 2 and 3). There was a further follow-up session in which each student spent two hours processing the data collected and was briefed on the coursework write-up.

Figure 3: The students tested hypotheses around distance from the lake, vegetation and soil type, soil moisture and soil compaction. Each student group spent two hours in the field.

Impact

Of 40 students who responded to an on-line survey:

  • 37 agreed the practical helped develop their critical thinking skills around complex and incomplete environmental data;
  • 36 agreed they were now better able to deal with uncertainty in field-based measurements;
  • 38 felt more confident working in the field.

Student quotes included:

  • “The practical was very useful in helping to understand the processes happening as well as being more confident in using the equipment.”
  • “I thought the practical was good as it was another way to process information which tends to work better for me, doing and seeing how it works allows me to gain a higher understanding in the processes”

The majority of students gained first class and upper second-class marks for the project write-up and the reports submitted demonstrated good critical thinking skills in the interpretation of the infiltration measurements. There has been a noticeable increase in the number of students opting for hydrology-based dissertations.

Reflections

Confidence and critical thinking skills can be enhanced with relatively simple, low-cost field-based practicals that scaffold subsequent modules (including Research Training for Geographers and Environmental Science, and the dissertation) and that focus on hypothesis testing in addition to knowledge acquisition. Each student spent 2 hours in the field on campus and 2 hours processing their data, with further time on the coursework write-up. This seems a reasonable investment in time given the benefits in confidence, skills and knowledge. Embedding such practicals should not replace the larger skills-based modules, such as Research Training, nor should such practical classes entirely replace those that focus more on knowledge acquisition, but practical classes in which students explore their own ideas appear to be a useful means of boosting student confidence and critical thinking skills at an early stage. The practical was also an excellent means of encouraging peer-to-peer interaction and learning, and this and similar practical classes have good potential for the integration of home and NUIST students.

Follow up

Embed similar practical classes in part one modules to build confidence at the outset of the degree programme and, at part three, to further enable integration of home and NUIST students.

Links and References

Kinder A. 2016. Geography: The future of fieldwork in schools. Online: http://www.sec-ed.co.uk/best-practice/geography-the-future-of-fieldwork-in-schools/ (Last accessed: 03 Jan 2020).

Lambert D and Reiss MJ. 2014. The place of fieldwork in geography and science qualifications. Institute of Education, University of London. ISBN: 978-1-78277-095-4. 20 pp.

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a re-design undertaken for the Part 1 autumn/spring chemistry module, CH1PRA, which serves approximately 45 students per year. All students complete practical work over 20 weeks of the year. There are four blocks of five weeks of practical work in rotation (introductory, inorganic, organic and physical), and students spend one afternoon (4 hours) in the laboratory per week. The re-design was partly due to COVID: we were forced to look critically at the experiments the students completed, to ensure that the practical skills students developed during the pandemic were relevant for Part 2 and beyond, and to ensure that the assessments could also work as stand-alone exercises if COVID prevented the completion of practical work. COVID actually provided us with an opportunity to re-invigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

• Redesign CH1PRA so that it was COVID-safe and fulfilled strict accreditation criteria.
• Redesign the experiments, converting some so they were suitable for completion on the open bench, to maximise laboratory capacity and allow as many students as possible to complete practical work.
• Redesign assessments so that students who missed sessions due to COVID could still gain credit.
• Minimise the assessment load on academic staff and students.
• Move to a more skills-based assessment paradigm, away from the traditional laboratory report.

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of a practical class due to restrictions on the number of students allowed within the laboratory: 12 students in the fumehoods and 12 students on the open bench (rather than up to 74 students all using fumehoods, as previously). Prior to the redesign, each student completed four or five assessments per 5-week block, all of which related to a laboratory-based experiment. In addition, the majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as we intended, so execution of the experiments was passive. The technical reports placed a significant marking burden upon academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments upon completion of a practical experiment was also deemed high-risk with the advent of COVID, so we had to re-think our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing processing of 24 students per week (12 on the bench and 12 in the fumehood), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class and therefore fulfilling our accreditation requirements. The majority of assessments were linked to these ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that if a student had to self-isolate they could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each 5-week block had three assessment points, and where possible one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require in employment or during a placement. To encourage active learning, reflection was embedded into the assessment regime; it was hoped that by critically appraising their performance in the laboratory, students would better remember the skills and techniques they had learnt, rather than falling into the “see, do, forget” mentality that is prevalent within practical classes.

Examples of assessments include: undertaking data analysis, focussing on clear presentation of data; critical self-reflection on the skills developed during a practical class, i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critically engaging with a published scientific procedure; and giving a three-minute presentation about a practical scientific technique commonly encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments we had to make to the delivery of laboratory-based practicals did not have a negative impact upon the students’ experience and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous, and there was a slight disconnect between what we expected students to achieve from the online practicals and what students were achieving. A number of the students polled disliked the online practical work, with the main reason being that the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and by ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they are aligned more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.


Misconceptions About Flipped Learning


 

During the COVID-19 pandemic, colleagues at UoR were called on to adjust their courses almost overnight from face-to-face teaching to fully online delivery. As the immediate future is still full of uncertainty, the UoR (2020) teaching and learning framework asks us to be creative in our pedagogical approaches and to come up with strategies that will make courses stimulating and engaging. Flipped learning is one of the approaches suggested in the framework. With that in mind, I have written two articles about flipped learning, published here and here.

Flipped learning is a pedagogical approach that has come at a timely moment during Covid-19. The advancement of internet technology, online learning platforms and social media, combined with growing exposure to the flipped learning approach, has promoted its adoption during this pandemic. However, despite its popularity and the published literature about flipped learning, it is evident that there are many misconceptions about it, and it remains a somewhat poorly understood concept among many.

In this last article, I thought I would write about and share some of the misconceptions about flipped learning that resonate most with me. At the same time, let us reflect on them and see how we can overcome them where possible. Your feedback is always welcome; please do send me your thoughts via w.tew@henley.ac.uk

 

Misconception 1: Flipped learning is about putting video content online

Reflection: This may be the most popular format for flipped learning, but it is NOT simply about putting videos online and having students do homework in class (or online during this pandemic). Referring to the UoR (2020) Teaching and Learning: Framework for Autumn term 2020, we are encouraged to prepare our teaching and lectures in a video format. This format works well with the flipped learning instructional strategy for delivering teaching content, but flipped learning can be about much more than that. Colleagues can opt for videos or just text (reading) materials if they flip their lessons. For example, we can make good use of the Blackboard (BB) LMS platform to include online reading materials using Talis Aspire, journal articles, case studies and news items that are relevant to our students. In other words, flipped learning does not necessarily rely entirely on videos.

 

Misconception 2: You need to be in the video

Reflection: This is not necessarily the case, especially as so many of us are shy and ‘unnatural’ in front of the camera, which is just how I feel myself. This is why the voice-recorded PowerPoint format can be a ‘lifesaver’ for many of us. Having said that, appearing in the video adds a personal touch to the learning materials for students. For example, wearing different hats when you are filming your videos makes them more interesting and helps ‘draw’ students’ attention to your content and lessons. Try it: you might even earn a “Mad Hatter” title from your students. Just one of my crazy ideas.

 

Misconception 3: You need to flip your entire module 

Reflection: Many of us assume that we need to flip the entire module for the entire academic year. NOT necessarily so! The whole idea of flipped learning is to foster student-centred learning, and teaching can be personalised to suit students’ needs and learning pace. Therefore, you can flip just one concept or topic, one entire term or just some weeks. Remember, the focus is on the students’ learning needs: a one-size-fits-all approach definitely does not fit in a flipped learning environment.

 

Misconception 4: Flipped learning is a fad and people have been doing it for years

Reflection: This was my initial thought when I first came to know about flipped learning. A fad is defined as “a style, activity, or interest that is very popular for a short period of time”, an innovation that never takes hold. Flipped learning is anything but this. The fact that it is still actively studied and researched today shows that it is not just a fad. Talbert (2017) argued that flipped learning is not just a rebranding of old techniques: flipped learning has its own pedagogical framework, and its value lies in its effects on learning. In brief, the definition of flipped learning (see Flipped Learning Network, 2014) differentiates it from other learning approaches.

 

Misconception 5: Flipping the classroom takes too much time

Reflection: To be honest, I do think there is some truth in this. Preparing for flipped learning and flipping lessons involve a lot of energy and time; from my own experience, I can testify that it takes a significant amount of time. How long depends on how tech-savvy the teacher is and how much of the teaching content needs to be flipped. However, the fruit of this hard labour and time investment is that, once designed, the flipped lesson will save you time. Ironic, isn’t it? That has been my experience. What I am trying to show is that once the content is created, you can reuse it year after year, and any subsequent updates and changes will take far less time than creating everything from scratch again.

Finally, I hope you have enjoyed my series on flipped learning published on this platform. I sincerely urge you to consider the flipped learning pedagogical approach during this pandemic, and please do not hesitate to get in touch to continue the conversation.

References

Flipped Learning Network (FLN) (2014) The Four Pillars of F-L-I-P™. Reproducible PDF available at: www.flippedlearning.org/definition

Talbert, R. (2017) Flipped Learning: A Guide for Higher Education Faculty. Stylus Publishing, LLC.

UoR (2020) Teaching and Learning: Framework for Autumn Term 2020. Available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen, School Director of Teaching and Learning, ISLI, and Alison Nicholson, Honorary Fellow, UoR

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, challenges and outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into their assessment mix. Group assessments constitute around 4% of the University’s assessment types (EMA data, academic year 2019-20).

Objectives

  • improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • enable students to be more relaxed and to produce more authentic and spontaneous language
  • make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules centre on the ability to communicate in the language. The teaching methodology employed favours student–student interaction and collaboration. In class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers as well as students were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed. One teacher concentrated on running the exam and marking against holistic marking criteria, while the second teacher listened and marked using analytic rating scales

Expected outcomes

  • Students to be more relaxed and to produce more authentic and spontaneous language: student-to-student interaction creates a more relaxed atmosphere, and students take longer speaking turns and use more features of interaction (Hardi Prasetyo, 2018)
  • Perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how the competence of the person a student is speaking to affects their outcomes (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses coming from students of Italian. Responses were consistent across the two languages.

8 Teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, nor that speaking to another student was more relaxing. The qualitative feedback tells us that this was due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for an average student cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on the students’ ability to demonstrate that they could communicate in the language.

However, this was not fully reflected in the students’ feedback: there is a clear discrepancy between teachers’ and students’ perceptions of how well the new format allows students to showcase their learning.

Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi & French (2011). In Taylor, L. (ed.), Examining Speaking: Research and Practice in Assessing Second Language Speaking. Cambridge: Cambridge University Press.

Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014). Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015). The Lecturer’s Toolkit: A Practical Guide to Assessment, Learning and Teaching (4th ed.). London; New York, NY: Routledge, Taylor & Francis Group.

 

How ISLI moved to full online teaching in four weeks

Daniela Standen, ISLI

Overview

ISLI teaches almost exclusively international students. Many of our programmes run all year round, so ISLI had to move to teaching exclusively online in the Summer Term. This case study outlines the approach taken and some of the lessons learnt along the way.

Objectives 

  • Delivering a full Pre-sessional English Programme online to 100 students.
  • Providing academic language and literacy courses for international students.
  • Teaching International Foundation students, with one cohort about to begin their second term at Reading.
  • Teaching students on the Study Abroad Programme.

Context  

In April 2020, as the country went into lockdown and most of the University had finished teaching, ISLI was about to start a ‘normal’ teaching term. The Pre-sessional English Programme was about to welcome 100 (mostly new) students to the University. The January entry of the International Foundation Programme was less than half-way through their studies, and the Academic English Programme was still providing language and academic literacy support to international students.

Implementation

Moving to online teaching was greatly facilitated by having in-house TEL expertise, as well as colleagues with experience of online teaching, who supported the upskilling of ISLI academic staff and were able to advise on programme, module and lesson frameworks.

We thought that collaboration would be key, so we put in place numerous channels for cross-School working to share best practice and tackle challenges. ISLI TEL colleagues offered weekly all-School Q&A sessions as well as specific TEL training. We set up a Programme Directors’ Community of Practice that meets weekly, and made full use of TEAMS as a space where resources and expertise could be shared. Some programmes also created a ‘buddy system’ for teachers.

The School primarily adopted an asynchronous approach to teaching, as synchronous delivery was made particularly difficult by having students scattered across the globe. We used a variety of tools, from videos, screencasts, narrated PowerPoints and Task & Answer documents to full Xerte lessons, generally combining several of these to build a lesson. Interactive elements were initially provided mostly asynchronously, using discussion boards, Padlet and Flipgrid. However, as the term progressed, feedback from students highlighted a need for some synchronous delivery, which was carried out using Blackboard Collaborate and TEAMS.

Impact

It has not been easy, but there have been many positive outcomes from having had to change our working practices. Despite the incredibly short timescales and almost non-existent preparation time, our PSE 3 students started and successfully finished their programme completely online, the IFP January-entry students are ready to start their revision weeks before sitting their exams in July, and international students writing dissertations and undertaking postgraduate research were supported throughout the term.

As a School we have learnt new skills and new ways of working that we might not have thought possible had we not been forced into them. These new ways of working have fostered cross-School collaboration and the sharing of expertise and knowledge.

Reflections

We have learnt a lot in the past three months. On average, it takes a day’s work to transform one hour of face-to-face teaching into a task-based online lesson.

Not all TEL tools are equally effective and efficient; below are some of our favourites:

  • For delivering content: narrated PowerPoints, screencasts, webinars, Task and Answer documents (PDF/Word)
  • For building online communities: live sessions on BB Collaborate (though students are sometimes shy about taking part in breakout group discussions), Flipgrid, discussion boards
  • For student engagement: BB Retention Centre, tutorials on Teams, small frequent formative assignments/tasks on Blackboard Assignments
  • For assessment: BB Assignments, Turnitin, Teams for oral assessment

If time were not a consideration, Xerte would also be on the list.

Copyright issues can have a real impact on what you can do when delivering completely online. Careful consideration also needs to be given when linking to videos, particularly if you have students who are based in China.

Follow up

ISLI is now preparing for the Summer PSE, which starts at the end of June. Many of the lessons learnt this term have fed into preparation for summer and autumn teaching. In particular, we have listened to our students, who told us clearly that face-to-face interaction, even if ‘virtual’, is really important, and we have included more webinars and Blackboard Collaborate sessions in our programmes.

Links

https://www.reading.ac.uk/ISLI/