Online Delivery of Practical Learning Elements for SKE

Jo Anna Reed Johnson, Gaynor Bradley, Chris Turner

j.a.reedjohnson@reading.ac.uk

Overview

This article outlines how we re-thought the delivery of the science practical elements of the Subject Knowledge Enhancement programme (SKE) following the impact of Covid-19 in March 2020 and the move online.  It focuses on what we learnt from delivering the practical elements of the programme online, which required students to engage in laboratory activities to develop their subject knowledge and skills in Physics, Chemistry and Biology, related to the school national curriculum, over two weeks in June and July 2021.  Whilst there are some elements of the programme we will continue to deliver online post Covid-19, there are aspects of practical work for which our students would still benefit from hands-on experience in the laboratory, with the online resources enhancing that experience.

Objectives

  • Redesign IESKEP-19-0SUP: Subject Knowledge Enhancement Programmes so it was COVID-safe and fulfilled the programme objectives in terms of Practical work
  • Redesign how students could access school science practical work with no access to laboratories. This relates to Required Practical work for GCSE and A Level
  • Ensure opportunities for discussion and collaboration when conducting practical work
  • Provide students with access to resources (posted and shopping list)
  • Review student perspectives related to the response to online provision

Context

In June 2020 there was no access to the science laboratories at the Institute of Education (L10) due to the Covid pandemic. The Subject Knowledge Enhancement Programme (SKE) is a pre-PGCE requirement for applicants who want to train to be a teacher but may be at risk of not meeting the right level of subject knowledge (Teacher Standard 3) by the end of their PGCE year (one-year postgraduate teacher training programme).  We had 21 SKE Science (Physics, Chemistry, Biology) students on the programme, 3 academic staff, one senior technician and two practical weeks to run.  We had to think quickly and imaginatively.  With a plethora of resources available online for school practical science, we set about reviewing these and deciding what we might use to provide our students with the experience they needed.  In addition, we streamlined the programme content for the practical weeks, as working online requires more time.

Implementation

In May 2020, the senior science technician was allowed access to the labs.  With a lot of work, she prepared a small resource pack with some basic equipment that was posted to each student.  This was supplemented with a shopping list that students would prepare in advance of the practical weeks.  For the practical week programme, we focused on making use of free videos available on YouTube and the web (https://www.youtube.com/watch?v=jBVxo5T-ZQM and https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12).

Having been part of an online lab research project at the University of Leicester, I introduced online lab simulations to students, using Go-Lab along with PhET (https://www.golabz.eu/ and https://phet.colorado.edu/).


We also wanted students to still feel some sense of ‘doing practical work’ and set up home labs for those topics we deemed suitable, e.g. heart dissection, quadrats, making indicators.


PowerPoints were narrated or streamed.  We set up regular meetings with colleagues each morning before the practical day started.  We gave live introductions to the students to outline the work to be covered that day.  In addition, we organised drop-in sessions, such as mid-morning breaks and end-of-day reviews, for discussion with the students.  Throughout, students worked in groups where they would meet, discuss questions and share insights.  This was through MS Teams meetings/channels, where we were able to build communities of practice.

Impact

End-of-programme feedback was received via our usual emailed evaluation sheet at the end of the practical weeks and the end of the programme, with 5/21 responses.  The overall feedback was that students had enjoyed the collaboration and thought the programme was well structured.

‘I particularly enjoyed the collaborative engagement with both my colleagues and our tutors. Given that these were unusual circumstances, it was important to maintain a strong team spirit as I felt that this gave us all mechanisms to cope with those times where things were daunting, confusing etc but also it gave us all moments to share successes and achievements, all of which helped progression through the course. I felt that we had the right blend of help and support from our tutors, with good input balancing space for us to collaborate effectively.’

Student Feedback initial evaluation

‘I enjoyed “meeting” my SKE buddies and getting to know my new colleagues. I enjoyed A Level Practical Week and found some of the online tools for experimentation and demonstrating experiments very helpful’

Student Feedback initial evaluation

To supplement this feedback, as part of a small-scale scholarly activity across three programmes, we also sent out an MS Form for students to complete to allow us to gain some deeper insights into the transition to online learning, with 22/100 responses.  For example, students described the excitement of doing practical work, and the experience of using online tools that they could then use in their own teaching practice:

‘…the excitement of receiving the pack of goodies through the post was real and I enjoyed that element of the program and it’s been genuinely useful. I’ve used some of those experiments that we did in the classroom and as PERSON B said virtually as well.’

Student Feedback MS Form

‘…some of the online simulators that we used in our SKE we’ve used. I certainly have used while we’ve been doing online learning last half term, like the PHET simulators and things like that…’

Student Feedback MS Form

The students who consented to take part engaged in four discussion groups (5 participants per group = 20 participants).  These took place on MS Teams and once again highlighted the benefits of online group engagement, as well as showing that we were still able to meet the programme objectives:

‘I just wanted to say, really. It was it was a credit to the team that delivered the SKE that it got my subject knowledge to a level where it needed to be, so I know that the people had to react quickly to deliver the program in a different way…’

Student Feedback Focus Group

There was also some feedback that fed into our programme development, such as surprise at how much independent learning there is on the programme, and at the amount of resources and other materials (e.g. exam questions).

Reflections

We adopted an integrated learning model.


We learnt that you do not have to reinvent the wheel.  With the plethora of tools online, we just needed to be selective, asking ‘did the tool achieve the purpose or learning outcome?’.  In terms of planning and running online activities, we engaged with Gilly Salmon’s (2004) five-stage model of e-learning.  This provides structure, and we would apply it to our general planning in the use of Blackboard or other TEL tools in the future.  We will continue to use the tools we used.  These are useful T&L experiences for our trainees as schools move more and more towards engagement with technology.

However, the students still felt that nothing can replace actual practical work in the labs:

‘I liked the online approach to SKE but feel that lab work face-to-face should still be part of the course if possible. There are two reasons for this: skills acquisition/familiarity with experiments and also networking and discussion with other SKE students.’

Student Feedback MS Form

Where possible we will do practical work in the labs but supplement this with online resources, videos and simulation applications.  We will make sure that the course structure prioritises practical work but also incorporates aspects of online learning.

We will continue to provide collaborative opportunities and engage students online for group work and tutorials in future years.  We also found that the ways in which we collaborated through communities of practice, on MS Teams channels, were very effective.  We set up groups, who continued to work in similar ways throughout the course and were able to share ideas by posting evidence and then engaging in discussion.  Again, this is something we will continue to do so that when our students are dispersed across the school partnership, in different locations, they can still be in touch and work on things collaboratively.

Links and References

Online Lab case Studies

https://www.golabz.eu/

https://phet.colorado.edu/

Practical week Free Videos

https://www.youtube.com/watch?v=jBVxo5T-ZQM

https://www.youtube.com/watch?v=SsKVA88oG-M&list=PLAd0MSIZBSsHL8ol8E-a-xgdcyQCkGnGt&index=12

Running Virtual Focus Groups – Investigating the Student Experience of the Academic Tutor System

Amanda Millmore, School of Law

Overview

I wanted to measure the impact of the new Academic Tutor System (ATS) on the students in the School of Law, and capture their experiences, both good and bad, with a view to making improvements. I had successfully bid for some small project funding from the ATS Steering Group prior to Covid-19. The obvious difficulty I faced in the lockdown was how to engage my students and encourage them to get involved in a virtual project. How can students co-produce resources when they are spread around the world?

Objectives

I planned to run focus groups with current students with dual aims:

  • To find out more about their experiences of the academic tutor system and how we might better meet their needs; and
  • To see if the students could collaboratively develop some resources advising their fellow students how to get the most out of tutoring.

The overall aim was to raise the profile of academic tutoring within the School and the positive benefits it offers to our students, but also to troubleshoot any issues.

Implementation

After exams, I emailed all students in the School of Law to ask them to complete an anonymous online survey about their experiences. Around 10% of the cohort responded.


Within that survey I offered students the opportunity to join virtual focus groups. The funding had originally been targeted at providing refreshments as an enticement to get involved, so I redeployed it to offer payment via Campus Jobs for the students’ time (a remarkably easy process). I was conscious that many of our students had lost their part time employment, and it seemed a positive way to help them and encourage involvement. I had 56 volunteers, and randomly selected names, ensuring that I had representation from all year groups.

I ran two focus groups virtually using MS Teams, each containing students from different years. This seemed to work well for the 11 students, who were all able to join the sessions, and recording the sessions online enabled me to take a more accurate note, which was particularly helpful. I was pleasantly surprised at how the conversation flowed virtually; with no more than six students in a group we kept all microphones on to allow everyone to speak, and I facilitated with some prompts, encouraging quieter participants to offer their opinions.


The students were very forthcoming with advice and their honest experiences. They were clear that a good tutor relationship can make a real and noticeable difference for students and those who had had good experiences were effusive in their praise. They were keen to help me find ways to improve the system for everyone.

Results

The students collaborated to produce their “Top Tips for Getting the Most Out of Your Academic Tutor”, which we have turned into a postcard to share with new undergraduates, using the free design software Canva (https://www.canva.com/).


The students also individually made short videos at home of their own top tips, and emailed them to me; I enlisted my teenage son to edit those into 2 short videos, one aimed at postgraduates, one for undergraduates, which I can use as part of induction.

From the project I now have useful data on how our students use their academic tutor. A thematic analysis of qualitative comments from the questionnaires and focus groups identified five key themes:

  • Tutor availability
  • Communication by the tutor
  • School level communication
  • Content of meetings
  • Staffing

From these themes I have drawn up a detailed action plan to be implemented to deal with student concerns.

Impact & Reflections

One of the main messages was that we need to do better at clearly communicating the role of the academic tutor to students and staff.

The students’ advice videos are low-tech but high impact, all recorded in lockdown on their phones from around the world, sharing what they wish they’d known and advising their fellow Law students how to maximise the tutor/tutee relationship. The videos have been shared with the STAR mentor team, the ATS Steering Group and the MCE team now has the original footage, to see if they can be used University-wide.

I am firmly convinced that students are more likely to listen to advice from their peers than from an academic, so am hopeful that the advice postcards and videos will help, particularly if we have a more virtual induction process in the forthcoming academic year.

Ultimately, whilst not the project I initially envisaged, our virtual focus group alternative worked well for my student partners, and they were still able to co-create resources, in a more innovative format than I anticipated. My message to colleagues is to trust the students to know what will work for their fellow students, and don’t be afraid to try something new.

 

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen, School Director of Teaching and Learning, ISLI; Alison Nicholson, Honorary Fellow, UoR

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, challenges and outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into its assessment mix. Group assessments constitute around 4% of the University's assessment types (EMA data, academic year 2019-20).

Objectives

  • To improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • To enable students to be more relaxed and produce more authentic and spontaneous language
  • To make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules centre on the ability to communicate in the language.  The teaching methodology favours student–student interaction and collaboration.  In class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers as well as students were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed. One teacher concentrated on running the exam and marking using holistic marking criteria, while the second teacher listened and marked using analytic rating scales

Expected outcomes

  • Students to be more relaxed and produce more authentic and spontaneous language
  • Student-to-student interaction creates a more relaxed atmosphere
  • Students take longer speaking turns
  • Students use more features of interaction

(Hardi Prasetyo, 2018)

  • For there to be perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how does the competence of the person I am speaking to affect my outcomes? (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison.  The response rate was 30%, with 70% of responses from students of Italian. Responses were consistent across the two languages.

8 Teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall students’ feedback was positive.  Students recognised closer alignment between teaching and assessment, and that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, or that speaking to a student was more relaxing.  The qualitative feedback tells us that this is due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for the average student cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on the students’ ability to demonstrate that they were able to communicate in the language.

However, this was not reflected in the students’ feedback. There is a clear discrepancy in the teachers and students’ perception of how the new format allows students to showcase learning.

Despite mitigating action being taken, students also reported anxiety around ‘interlocutor effect’.  Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi & French, in Taylor, L. (ed.) (2011). Examining Speaking: Research and practice in assessing second language speaking. Cambridge: CUP.

Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014). Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015). The Lecturer’s Toolkit: A practical guide to assessment, learning and teaching (4th ed.). London; New York: Routledge.

 

Supporting Transition: Investigating students’ experiences of transferring from University of Reading Malaysia campus (UoRM) to the University of Reading UK campus (UoR)

Daniel Grant, Associate Professor in Clinical Pharmacy & Pharmacy Education, Pharmacy Director of Teaching & Learning, & Dr Taniya Sharmeen, Research Fellow


Click here to read the full report.


This slide summarises the project:


How ISLI’s Assessment Team created an online oral exam for the Test of English for Educational Purposes (TEEP)

Fiona Orel – International Study and Language Institute (ISLI)


Overview

ISLI’s Test of English for Educational Purposes (TEEP) is administered at the end of pre-sessional courses as a measure of students’ academic English proficiency. The speaking test has traditionally been an academic discussion between two students that is facilitated by an interlocutor and marked by an observer.

This case study outlines the process of creating a version of the TEEP speaking test for 1-1 online delivery.

Objectives

  • To create an online TEEP speaking test that could be administered at the beginning of June to 95 students
  • To ensure reliability and security of results
  • To support students and teachers with the transition

Context

The Pre-sessional English course 3 (PSE 3) started in April, during the period of lockdown.  At the end of the course all students sit a TEEP test, which includes a test of speaking skills. We realised that we would not be able to administer the usual two-student + two-teacher test, given the constraints of the technology and the changes in teaching and learning, which had reduced to a certain degree the students’ opportunities for oral interaction. We would therefore need to develop a new 1-1 test that maintained the validity and reliability of the original TEEP speaking test.

Implementation

We had two main objectives: to create a valid online 1-1 speaking test, and to make sure that the technology we used to administer the test was simple and straightforward for both teachers and students, and would have reasonably reliable connectivity in the regions where students were based (China, the Middle East and the UK).

The first thing we needed to do was return to our test specifications: what exactly were we hoping to assess through the oral exam? The original face-to-face test had five criteria: overall communication, interaction, fluency, accuracy and range, and intelligibility. We knew that interaction had been impacted by the move online, but decided that responding appropriately to others was a crucial aspect of interaction that needed to remain, and included it in the ‘overall communication’ criterion. Recognising that interlocutors would also need to be examiners, we worked on streamlining the criteria to remove redundancy and repetition and to ensure that each block contained the same type of description in the same order, thereby making it easier for tutors to skim and recall.

We then worked out exactly what functions and skills in speaking we wanted to test and how we could do that while mostly working with existing resources. We aligned with the original test specifications by testing students’ ability to:

  • Provide appropriate responses to questions and prompts
  • Describe experiences and things
  • Give and justify an opinion by, for example, stating an opinion, explaining causes and effects, comparing, and evaluating.

The test format that enabled this was:

  • Part one: an interview with the student about their studies and experience of studying online
  • Part two: a problem-solving scenario. Students are introduced to a problem, which the teacher screen-shares with them, and are given three possible solutions to compare, evaluate and rank from most to least effective
  • Part three: abstract discussion building on the talk given in part two

The final stage was trialling a platform to conduct the tests. We had considered Zoom due to its reliability but discounted it due to security concerns. BB Collaborate had connectivity issues in China so we decided to use Teams as connectivity was generally better and students and teachers were familiar with the platform as they had been using it for tutorials. Due to the spread of students over time zones, we decided to spread the speaking tests over three mornings finishing by 11:00 BST on each day. We kept the final slot on Friday free for all teachers to enable rescheduling of tests for any student experiencing issues with connectivity on the day.

Finally, we needed to help teachers and students prepare for the tests. For students, learning materials were produced with videos of a sample test; there was a well-attended webinar to introduce the format and requirements, and the recording of this webinar was made available to all students, along with a document on their BB course. This instructed them what to do before test day and what to expect on the day.

The test format and procedures were introduced to teachers with instructions for tasks to do before, during, and after the test. There was also an examiner’s script prepared with integrated instructions and speech to standardise how the tests were administered. Each test was recorded to ensure security and to enable moderation, and all students had to verify their identity at the start of the test. The test recording caused some problems, as the video had to be downloaded and deleted from Stream before anyone else, including the student in the Teams meeting who had been tested, could access it. For this reason we allowed 40 minutes for each 20-minute interview, as downloading was sometimes a lengthy process depending on internet speeds. We had two or three people available each day to pick up any problems, such as a teacher being unwell or having tech issues, and/or a student experiencing problems. This worked well; on the first two days we did have to reschedule a number of tests, but fortunately all worked well on the final day. The teachers were fully committed and worked hard to put students at ease. Informal feedback from students showed appreciation of the opportunity to talk 1-1 with a tutor, and tutors said that the test format provided plenty of evidence upon which to base a decision.

Impact

The test was successful overall and there were fewer technical issues than we had anticipated. Teachers and students were happy with it as an assessment measure and we were able to award valid and reliable grades.

Working together collaboratively with the teachers and the Programme Director was incredibly rewarding and meant that we had a wide resource base of talent and experience when we did run into any problems.

Reflections

Incredibly detailed planning, the sharing of information across Assessment and Pre-sessional Teams, and much appreciated support from the TEL team helped to make the test a success. Students and teachers had very clear and detailed instructions and knew exactly what was expected and how the tests would be conducted. The sharing of expertise across teams meant that problems were solved quickly and creatively, and it is good to see this practice becoming the norm.

We need to work on the issue of downloading and deleting the video after each test, as this caused some anxiety for teachers with slower internet connections. We also need to have more technical support available, especially on the first day. Most students had tested their equipment as instructed, but some who hadn’t did experience issues. It would be even better if a similar activity could be built into the course so that teachers and students experience the test situation before the final test.

Follow up

ISLI’s Assessment Team is now preparing to administer the same tests to a much larger cohort of students at the end of August. We will apply the lessons learned during this administration and work to make the process easier for teachers.

Flipped learning revisited

Dr Edward Tew – Lecturer in Accounting, HBS.


Due to the COVID-19 pandemic, many of us at UoR moved to ‘emergency’ remote teaching at the end of the spring term. Colleagues across the university were developing instruction using VLE platforms such as UoR Blackboard, and students were studying and working online at a distance.

At the same time, we were urged to find different ways to provide meaningful online learning. In response, UoR has recently published a Teaching and Learning Framework for Autumn 2020 that intends to balance online delivery with interactive teaching. One particular point to note is that this framework is “influenced by pedagogical approaches used in flipped learning” (UoR, 2020). With this in mind, I thought I would share my reflections on flipped learning, particularly on the framework proposed by the Flipped Learning Network (FLN) (2014) and its application during the Covid-19 pandemic.

According to FLN’s (2014) definition, flipped learning is:

“a pedagogical approach in which direct instruction moves from the group learning space to the individual learning space, and the resulting group space is transformed into a dynamic, interactive learning environment where the educator guides students as they apply concepts and engage creatively in the subject matter.”

With this definition, the flipped learning/classroom approach is built around four “pillars”: F – flexible environment, L – learning culture, I – intentional content, and P – professional educator (Flipped Learning Network, 2014). I believe these four “pillars” can be applied in online flipped lessons in which students learn either synchronously or asynchronously. To meet the demand for distance/online learning, especially given the current pandemic, I share below my reflections on the use of flipped learning, considering each of the four pillars in turn.

 

  1. F – Flexible Environment

In this pillar, educators allow a variety of learning modes in which students choose when and where they learn, whether through group work or independent study.

Application: Select a platform that will be the hub of your online classroom and of all your instructional activities and resources.  In my module, I stuck to using Blackboard (BB) as my core online learning platform with students. Don’t be afraid to be experimental and mix the available tools on BB, such as wikis, to upload students’ group work, presentations, and research activities. There are many other platforms available, such as Google Classroom, Microsoft Teams and Webex. The key is to use what is already familiar to students so that the learning process and activities are easier for them to navigate and participate in effectively.

 

  2. L – Learning Culture

The learning culture in flipped learning shifts the traditional teacher-centred model to a learner-centred approach. As a result, students are actively involved in knowledge construction as they participate in and evaluate their learning in a personally meaningful manner.

Application: Once you’ve chosen your platform, decide how you will organise learning to encourage a learner-centred learning culture. To do this, communication is key. In my module, I made sure the module was easy to navigate and that the learning aims and objectives were clearly stated, so students could see what they were learning for each topic/lesson. I also made use of BB’s module page, with folders created to organise my teaching materials, learning activities, presentations etc. Next, I tried to encourage collaboration in learning using tools such as discussion boards, wikis, blogs and online meetings (e.g. WhatsApp, Teams, Zoom). These provide user-friendly spaces to get my students working collaboratively and sharing ideas. With all these tools, I consistently made it clear that students were expected to do their learning first before coming together for critical discussion and interaction.

 

  3. I – Intentional Content

In this pillar, educators decide what they need to teach and what materials students should explore on their own. This pillar aims to maximise classroom time to encourage student-centred learning as considered in pillar 2 above.

Application: Intentional learning occurs when we purposefully select and deliver content to actively engage students in learning. In my module, I always leave a ‘gap’ for students to further explore and research the topic in a group or individually. I used the discussion board and wikis to see their collaborative work and research on the subject matter. I also made sure they could apply what they had learned in an assigned case study, and I intentionally used assessment strategies that test students’ ability to conduct their own research and think critically. Textbook resources and library learning resources are particularly useful in this respect.

 

  4. P – Professional Educator

FLN (2014, p.2) proposes that “Professional Educators are reflective in their practice, connect with each other to improve their instruction, accept constructive criticism, and tolerate controlled chaos in their classrooms. While Professional Educators take on less visibly prominent roles in a flipped classroom, they remain the essential ingredient that enables Flipped Learning to occur.”

Application: The teacher’s role in an online flipped classroom is to facilitate learning, just as in a physical classroom. In practice, this means it is essential to be available to your students virtually, providing instructional support and feedback. In my module, I monitored the online discussion board and provided feedback promptly. I also used BB Collaborate to hold 1-2-1 and group sessions with students to monitor their learning progress and provide constructive feedback.

This is a challenging time for numerous reasons, particularly the anxiety of the unknown surrounding the virus. Given the current situation under the COVID-19 crisis, implementing an online flipped classroom makes sense so that students do not fall behind in their learning. However, the barriers and challenges to developing an effective one must be acknowledged. In this light, I resonate with FLN’s (2014) description of Professional Educators above: I need to be agile and robust in improving my instructional strategies, accept constructive criticism, and welcome controlled chaos in the online flipped classroom. We must adapt, change quickly and move forward effectively to counter the challenges of this unprecedented time.

 

References:

Flipped Learning Network (FLN) (2014) The Four Pillars of F-L-I-P™. Reproducible PDF available at: www.flippedlearning.org/definition.

UoR (2020) Teaching and Learning: Framework for Autumn term 2020, available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

 

The DEL Feedback Action Plan

Madeleine Davies, Cindy Becker and Michael Lyons- SLL

Overview

A feedback audit and consultation with the Student Impact Network revealed a set of practices DEL needs to amend. The research produced new student-facing physical and online posters, designed by a ‘Real Jobs’ student, showing students how to find their feedback online, and generated ‘marking checklists’ for staff indicating what should be included in feedback and what should be avoided.

Objectives

  • To assess why students scored DEL poorly on feedback in NSS returns
  • To consult with students on types of feedback they considered useful
  • To brief colleagues on good practice feedback
  • To produce consistency (but not conformity) in terms of, for example, the amount of feedback provided, feedforward, full feedback for First Class work, etc.
  • To assess whether marking rubrics would help or hinder DEL feedback practice

Context

The ‘DEL Feedback Action Project’ addresses the persistent issue of depressed NSS responses to Department of English Literature assessment and feedback practices. The responses to questions in ‘teaching quality’ sections are favourable but the 2018 NSS revealed that, for English Studies, Reading is in the third quartile for the ’Assessment and Feedback’ section and the bottom quartile for question 8 (scoring 64% vs the 74% median score) and question 9 (scoring 70% vs the 77% median score).

In October 2018, DEL adopted eSFG. An EMA student survey undertaken in January 2019 polled 100 DEL students and found that, though students overwhelmingly supported the move to eSFG, complaints about the quality of DEL feedback persisted.

Implementation

Michael Lyons began the project with an audit of DEL feedback and identified a number of areas where the tone or content of feedback might need improving. This material was taken to the Student Impact Network, which was shown anonymised samples of feedback for comment. The students’ comments produced a set of indicators which became the basis of the ‘marking checklist’ for DEL staff. Simultaneously, DEL staff were asked to discuss feedback practice in ‘professional conversations’ for the annual Peer Review exercise, ensuring that the combined minds of the whole department were reflecting on this issue.

Student consultation also revealed that many students struggle to find their feedback online. With this in mind, we collaborated with TEL to produce ‘maps to finding feedback’ for students. A ‘Real Jobs’ student designer converted this information into clear, readable posters which can be displayed online or anywhere in the University (the information is not DEL-specific). The posters will be of particular use for incoming students but our research also suggested that Part 3 students are often unaware of how to access feedback.

The results of the initial audit and consultation with students indicated where our feedback had been falling short. We wrote a summary of these findings for the DEL HoD and DDTL.

Research into marking rubrics suggested that DEL marking would not be suited to this feedback practice, both because rubrics can be inflexible and because DEL students resist ‘generic’ feedback.

Impact

The student-facing posters and staff-facing ‘marking checklist’ speak to two of the main issues with DEL feedback indicated by students. The latter will deter overly brief, curt feedback and will prompt more feedforward and comment on specific areas of the essay (for example, the introductory passage, essay structure, referencing, grammar, use of secondary resources, etc.).

With DEL staff now focused on the feedback issue, and with students equipped to access their feedback successfully, we are hoping to see a marked improvement in NSS scores in this area in 2020-21.

For ‘surprises’, see ‘Reflections’.

Reflections

The pressure on academic staff to mark significant amounts of work within tight deadlines can lead to potential unevenness in feedback. We are hoping that our research prompts DEL to streamline its assessment practice to enhance the quality and consistency of feedback and feedforward.

Students’ responses in the Student Impact Network also suggested that additional work is required on teaching students how to receive feedback. Over-sensitivity in some areas can produce negative scores. With this in mind, the project will terminate with an equivalent to the ‘marking checklist’ designed for students. This will remind students that feedback is anonymous, objective, and intended to pave the way to success.

Follow up

Monitoring NSS DEL feedback scores in the 2020-21 round, and polling students in the next session to ensure that they are now able to access their feedback.

Continuing to reflect on colleagues’ marking workload and the link between this and unconstructive feedback.

 

 

Student co-creation of course material in Contract Law

Dr Rachel Horton, School of Law

Overview

The PLaNT project involved the co-creation, with students, of a series of podcasts and other materials for Contract Law (LW1CON). Student leaders consulted with their peers to decide which materials students felt would most enhance learning on the module and then created these together with the Module Convenor.

Objectives

This project aimed to engage current law students as co-creators of course learning material.

Context

Contract Law is a large compulsory first-year module – in an average year between 250 and 300 students take the module – taught using a traditional combination of lectures and small-group teaching. Module staff were keen to develop additional resources for students to access in their own time through Blackboard, and wanted to engage students in developing these.

Implementation

Staff met with selected students to introduce a student curated Blackboard space, in which the students had authoring permissions to generate podcast feeds, which would be accessible to all students enrolled on the module.  These students were then asked to consult with their peers to generate ideas for use of the space/topics for the podcasts.

The student leaders then created a series of podcasts, largely focusing on revision materials and assessment and exam technique by interviewing lecturers on the module. The students also devised and created a series of written materials, in a variety of formats, and lecturers provided feedback on these (chiefly to ensure accuracy) before they were uploaded onto Blackboard.

Impact

The student leaders were highly engaged and enthusiastic and went well beyond their original remit in devising course content. They fed back, informally, that they had found the experience immensely beneficial to their own learning, as well as giving them the opportunity to develop a range of leadership, technical and communication skills.

Statistics on Blackboard showed that the materials were well used by the rest of the cohort, particularly in the immediate run up to the exams. While it proved difficult to recruit students for a focus group after the project had finished, in order to gain more structured feedback, student representatives commented at the Staff Student Liaison Committee that they had received very positive feedback from students about the additional materials created through the project.

Reflections

The success of the activity was largely a result of the enthusiasm, imagination and commitment of the students involved. We were lucky to recruit students who were able to work very well together, and with their peers, to create resources to genuinely enhance learning, and to fill gaps in course materials that may otherwise have gone unnoticed by staff.

The project also offered an opportunity for the teaching staff on the module to reflect on the content and format of materials students want. Even after the funded project finished, this proved very helpful in enabling us to continue producing similar materials, particularly once teaching had to move online in the wake of COVID-19.

The project and funding began in the Spring term and, with hindsight, it would have been beneficial to start the project earlier in the course. In particular, this would have provided an opportunity to gather more structured feedback from the whole cohort (it was difficult to secure a meaningful student response once the summer exams were over).

Follow up

The materials produced by the students remain relevant for future cohorts and will continue to be made available. New materials will be developed along similar lines, with student input wherever possible, particularly next year as lectures move wholly online.

How ISLI moved to full online teaching in four weeks

Daniela Standen, ISLI

Overview

ISLI teaches almost exclusively international students. Many of our programmes run all year round, so ISLI had to move to teaching exclusively online in the Summer Term. This case study outlines the approach taken and some of the lessons learnt along the way.

Objectives 

  • Delivering a full Pre-sessional English Programme online to 100 students.
  • Providing academic language and literacy courses for international students.
  • Teaching International Foundation students, with one cohort about to begin their second term at Reading.
  • Teaching students on the Study Abroad Programme.

Context  

In April 2020, as the country was in lockdown and most of the University had finished teaching, ISLI was about to start a ‘normal’ teaching term. The Pre-sessional English Programme was about to welcome 100 (mostly new) students to the University, the January entry of the International Foundation Programme was less than half-way through their studies, and the Academic English Programme was still providing language and academic literacy support to international students.

Implementation

Moving to online teaching was greatly facilitated by having in-house TEL expertise, as well as colleagues with experience of online teaching, who supported the upskilling of ISLI academic staff and were able to advise on programme, module and lesson frameworks.

We thought that collaboration would be key, so we put in place numerous channels for cross-School working to share best practice and tackle challenges. ISLI TEL colleagues offered weekly all-School Q&A sessions as well as specific TEL training. We set up a Programme Directors’ Community of Practice that meets weekly, and made full use of Teams as a space where resources and expertise could be shared. Some programmes also created a ‘buddy system’ for teachers.

The School primarily adopted an asynchronous approach to teaching, as synchronous delivery was made particularly difficult by having students scattered across the globe. We used a variety of tools, from videos, screencasts, narrated PowerPoints and Task & Answer documents to full Xerte lessons, generally combining several of these to build a lesson. Interactive elements were initially provided mostly asynchronously, using discussion boards, Padlet and Flipgrid. However, as the term progressed, feedback from students highlighted a need for some synchronous delivery, which was carried out using Blackboard Collaborate and Teams.

Impact

It has not been easy, but there have been many positive outcomes from having had to change our working practices. Despite the incredibly short timescales and the almost non-existent preparation time, our PSE 3 students started and successfully finished their programme completely online, the IFP January-entry students are ready to start their revision weeks before sitting their exams in July, and international students writing dissertations and postgraduate research were supported throughout the term.

As a School we have learnt new skills and learnt to work in ways that we might not have thought possible had we not been forced into them. These new ways of working have fostered cross-School collaboration and the sharing of expertise and knowledge.

Reflections

We have learnt a lot in the past three months. On average, it takes a day’s work to transform one hour of face-to-face teaching into a task-based online lesson.

Not all TEL tools are equally effective and efficient; below are some of our favourites:

  • For delivering content: narrated PowerPoints, screencasts, webinars, Task & Answer documents (PDF/Word)
  • For building online communities: live sessions on BB Collaborate (though students are sometimes shy to take part in breakout group discussions), Flipgrid, discussion boards
  • For student engagement: BB Retention Centre, tutorials on Teams, small frequent formative assignments/tasks on Blackboard Assignments
  • For assessment: BB Assignments, Turnitin, Teams for oral assessment

If time were not a consideration Xerte would also be on the list.

Copyright issues can have a real impact on what you can do when delivering completely online. Careful consideration also needs to be given when linking to videos, particularly if you have students who are based in China.

Follow up

ISLI is now preparing for the Summer PSE, which starts at the end of June. Many of the lessons learnt this term have fed into preparation for summer and autumn teaching. In particular, we have listened to our students, who told us clearly that face-to-face interaction, even if ‘virtual’, is really important, and have included more webinars and Blackboard Collaborate sessions in our programmes.

Links

https://www.reading.ac.uk/ISLI/