Xerte: engaging asynchronous learning tasks with automated feedback

Overview

Jonathan Smith: j.p.smith@reading.ac.uk

 

International Study and Language Institute (ISLI)

This article describes a response to the need to convert paper-based learning materials, designed for use on our predominantly face-to-face summer Pre-sessional programme, to an online format, so that students could work independently and receive automated feedback on tasks, where that is appropriate. A rationale for our approach is given, followed by a discussion of the challenges we faced, the solutions we found and reflections on what we learned from the process.

Objectives

The objectives of the project were broadly to:

  • rethink ways in which learning content and tasks could be presented to students in online learning formats
  • convert paper-based learning materials intended for 80 – 120 hours of learning to online formats
  • make the new online content available to students through Blackboard and monitor usage
  • elicit feedback from students and teaching staff on the impacts of the online content on learning.

It must be emphasized that due to the need to develop a fully online course in approximately 8 weeks, we focused mainly on the first 3 of these objectives.

Context

The move from a predominantly face-to-face summer Pre-sessional programme, with 20 hours/week contact time and some blended-learning elements, to fully-online provision in Summer 2020 presented both threats and opportunities to ISLI.  We realised very early on that it would not be prudent to attempt 20 hours/week of live online teaching and learning, particularly since most of that teaching would be provided by sessional staff, working from home, with some working from outside the UK, where it would be difficult to provide IT support. In addition, almost all students would be working from outside the UK, and we knew there would be connectivity issues that would impact on the effectiveness of live online sessions.  In the end, there were 4 – 5 hours/week of live online classes, which meant that a lot of the core course content had to be covered asynchronously, with students working independently.

We had been working with Xerte, an open-source tool for authoring online learning materials, for about 3 years, creating independent study materials for consolidation and extension of learning based around print materials.  This was an opportunity to put engaging, interactive online learning materials at the heart of the programme.  Here are some of the reasons why we chose Xerte:

  • It allows for inputs (text, audio, video, images), interactive tasks and feedback to be co-located on the same webpage
  • There is a very wide range of interactive task types, including drag-and-drop ordering, categorising and matching tasks, and “hotspot” tasks in which clicking on part of a text or image produces customisable responses.
  • It offers considerable flexibility in planning navigation through learning materials, and in the ways feedback can be presented to learners.
  • Learning materials could be created by academic staff without the need for much training or support.

Xerte was only one of the tools for asynchronous learning that we used on the programme.  We also used stand-alone videos, Discussion Board tasks in Blackboard, asynchronous speaking skills tasks in Flipgrid, and written tasks submitted for formative or summative feedback through Turnitin.  We also included a relatively small number of tasks presented as Word docs or PDFs, with a self-check answer key.

Implementation

We knew that we only had time to convert the paper-based learning materials into an online format, rather than start with a blank canvas, but it very quickly became clear that the highly interactive classroom methodology underlying the paper-based materials would be difficult to translate into a fully-online format with greater emphasis on asynchronous learning and automated feedback.  As much as possible we took a flipped learning approach to maximise efficient use of time in live lessons, but it meant that a lot of content that would normally have been covered in a live lesson had to be repackaged for asynchronous learning.

In April 2020, when we started to plan the fully-online programme, we had a limited number of staff who were able to author in Xerte.  Fortunately, we had developed a self-access training resource, which meant that new authors were able to learn how to author with minimal support from ISLI’s TEL staff. A number of sessional staff with experience in online teaching or materials development were redeployed from teaching on the summer programme to materials development.  We provided them with a lot of support in the early stages: providing models and templates, storyboarding, and reviewing drafts together. We also produced a style guide so that we had consistent formatting conventions and presentation standards.

The Xerte lessons were accessed via links in Blackboard, and in the end-of-course evaluations we asked students and teaching staff a number of open and closed questions about their response to Xerte.

Impact

We were not in a position to assess the impact of the Xerte lessons on learning outcomes, as we were unable to differentiate between this and the impacts of other aspects of the programme (e.g. live lessons, teacher feedback on written work).  Students are assessed on the basis of a combination of coursework and formal examinations (discussed by Fiona Orel in other posts to the T&L Exchange), and overall grades at different levels of performance were broadly in line with those in previous years, when the online component of the programme was minimal.

In the end-of-course evaluation, students were asked “How useful did you find the Xerte lessons in helping you improve your skills and knowledge?” 245 students responded to this question: 137 (56%) answered “Very useful”, 105 (43%) “Quite useful” and 3 (1%) “Not useful”.  The open questions provided a lot of useful information that we are taking into account in revising the programme for 2021.  There were technical issues around playing video for some students, and bugs in some of the tasks; most of these issues were resolved after they were flagged up by students during the course. In other comments, students said that feedback needed to be improved for some tasks, that some of the Xerte lessons were too long, and that we needed to develop a way for students to quickly return to specific Xerte lessons for review later in the course.

Reflections

We learned a lot, very quickly, about instructional design for online learning.

Instructions for asynchronous online tasks need to be very explicit and unambiguous, because at the time students are using Xerte lessons they are not in a position to check their understanding either with peers or with tutors.  We produced a video and a Xerte lesson aimed at helping students understand how to work with Xerte lessons to exploit their maximum potential for learning.

The same applies to feedback.  In addition, to have value, automated feedback generally (but not always) needs to be detailed, with explanations of why specific answers are correct or incorrect.  We occasionally found that short videos embedded in the feedback were more effective than written feedback.

Theoretically, Xerte will track individual student use and performance, if lessons are uploaded as SCORM packages into Blackboard, with grades feeding into Grade Centre.  In practice, this only works well for a limited range of task types.  The most effective way to track engagement was to follow up on Xerte lessons with short Blackboard tests.  This is not an ideal solution, and we are looking at other tracking options (e.g. xAPI).
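For readers unfamiliar with xAPI: rather than reporting a single score the way SCORM does, an xAPI-enabled lesson sends small JSON "statements" (actor, verb, object) to a learning record store, which makes finer-grained engagement tracking possible. The sketch below shows the shape of a minimal "completed" statement; the student and lesson identifiers are hypothetical placeholders, not real ISLI addresses.

```python
import json

# A minimal xAPI statement recording that a learner completed a lesson.
# The verb IRI is the standard ADL "completed" verb; the actor mbox and
# object id below are hypothetical examples.
statement = {
    "actor": {
        "mbox": "mailto:student@example.ac.uk",
        "name": "Example Student",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.ac.uk/xerte/lesson-1",
        "definition": {"name": {"en-US": "Example Xerte lesson"}},
    },
}

# In practice this JSON would be POSTed to a learning record store.
print(json.dumps(statement, indent=2))
```

Because each statement is free-form JSON, verbs such as "attempted", "answered" or "experienced" can be attached to individual tasks, which is precisely the granularity SCORM grade passback lacks.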

Over the 4 years we have been working with Xerte, we had occasionally heard suggestions that Xerte was too complex for academics to learn to use.   This emphatically was not our experience over Summer 2020.  A number of new authors were able to develop pedagogically-sound Xerte lessons, using a range of task types, to high presentation standards, with almost no 1-to-1 support from the ISLI TEL team.  We estimate that, on average, new authors need to spend 5 hours learning how to use Xerte before they are able to develop materials at an efficient speed, with minimal support.

Another suggestion was that developing engaging interactive learning materials in Xerte is so time-consuming that it is not cost-effective.  It is time-consuming, but faced with a situation in which we felt we had no alternative, we managed to achieve all we had set out to achieve.  Covid and the need to develop a fully-online course under pressure of time really focused our minds.  The Xerte lessons will need reviewing, and some will definitely need revision, but we face summer 2021 in a far more resilient, sustainable position than at this time last year.  We learned that it makes sense to plan for a minimum 5-year shelf life for online learning materials, with regular review and updating.

Finally, converting the paper-based materials for online learning forced us to critically assess them in forensic detail, particularly in the ways students would work with those materials.   In the end we did create some new content, particularly in response to changes in the ways that students work online or use technological tools on degree programmes.

Follow up

We are now revising the Xerte lessons, on the basis of what we learned from authoring in Xerte, and the feedback we received from colleagues and students.  In particular, we are working on:

  • ways to better track student usage and performance
  • ways to better integrate learning in Xerte lessons with tasks in live lessons
  • improvements to feedback.

For further information, or if you would like to try out Xerte as an author and with students, please contact j.p.smith@reading.ac.uk, and we can set up a trial account for you on the ISLI installation. If you are already authoring with Xerte, you can also join the UoR Xerte community by asking to be added to the Xerte Users Team.

Links and References

The ISLI authoring website provides advice on instructional design with Xerte, user guides on a range of page types, and showcases a range of Xerte lessons.

The international Xerte community website provides Xerte downloads, news on updates and other developments, and a forum for discussion and advice.

Finally, this website, itself authored in Xerte, provides the most comprehensive showcase of all the different page types available in Xerte, showing its potential functionality across a broad range of disciplines.

Learning to Interpret and Assess Complex and Incomplete Environmental Data

Andrew Wade a.j.wade@reading.ac.uk

Department of Geography and Environmental Sciences

Overview

Field work is well known to improve student confidence and enhance skills and knowledge, yet there is evidence for a decline in field work in Secondary Education, especially amongst A-level Geography students. This is problematic as students are entering Geography and Environmental Science degree programmes with reduced skills and confidence around field-based data collection and interpretation, and this appears to be leading to an apprehension around data collection for dissertations. A simple field-based practical was developed in which 47 Part 2 Geography and Environmental Science students tested their own hypotheses about the factors that control water infiltration into soils. Improved confidence and appreciation of critical thinking around environmental data were reported in a survey of the student experience. Student coursework demonstrated that attainment was very good, and that skills and critical thinking can be recovered and enhanced with relatively simple, low-cost field-based practical classes that can be readily embedded to scaffold subsequent modules, including the dissertation.

Context

The importance of field work is well established in Geography and Environmental Science as a means of active and peer-to-peer learning. However, students appear to have little confidence in designing their own field work for hypothesis testing when they arrive for Part 1, probably due to a decline in field work in Secondary Education (Kinder 2016, Lambert and Reiss 2014). Within the Geography and Environmental Science programmes, there is a Part 2, 20-credit ‘Research Training’ module that develops these skills. However, this research training module and the dissertation are seen by students as high risk, in that they perceive that a low mark will have a significant negative impact on the overall degree classification. Consequently, students are seemingly risk averse around field-based projects. The idea here is to make field-based training more commonplace throughout multiple modules through the inclusion of relatively simple practical training, so that hypothesis testing, critical thinking and confidence with ‘messy’ environmental data become intuitive and students are at ease with these concepts. In parallel, GES module cohorts have increased in recent years, and this is an additional reason to develop simple, low-cost practical classes.

Objectives

The aim of the project was to determine if a simple, field-based practical would help boost student confidence around field data collection and interpretation, and hypothesis testing. The objective was to give the students a safe and supportive environment in which to develop their own hypotheses and method for field data collection, and to learn to interpret often ‘messy’ and ‘complex’ environmental data.

Figure 1: The practical class took place on the hill-slope on campus between the Atmospheric Observatory and Whiteknights Lake on 28 October 2019, over four hours in total.

 

Figure 2: Students used a Decagon Devices Mini-Disc Infiltrometer to measure unsaturated hydraulic conductivity to test their own hypotheses about the factors controlling infiltration

Implementation

A practical was designed in which 47 Part 2 students, working in groups of four or five, developed their own hypotheses around the factors controlling rainfall infiltration on a hill-slope in the classroom following an in-class briefing, and then tested these hypotheses in the field using Mini Disc infiltrometers (Figs. 1, 2 and 3). There was a further follow-up session in which each student spent two hours processing the data collected and was briefed on the coursework write-up.

Figure 3: The students tested hypotheses around distance from the lake, vegetation and soil type, soil moisture and soil compaction. Each student group spent two hours in the field.

Impact

Of 40 students who responded to an on-line survey:

  • 37 agreed the practical helped develop their critical thinking skills around complex and incomplete environmental data;
  • 36 agreed they were now better able to deal with uncertainty in field-based measurements;
  • 38 felt more confident working in the field.

Student quotes included:

  • “The practical was very useful in helping to understand the processes happening as well as being more confident in using the equipment.”
  • “I thought the practical was good as it was another way to process information which tends to work better for me, doing and seeing how it works allows me to gain a higher understanding in the processes”

The majority of students gained first class and upper second-class marks for the project write-up and the reports submitted demonstrated good critical thinking skills in the interpretation of the infiltration measurements. There has been a noticeable increase in the number of students opting for hydrology-based dissertations.

Reflections

Confidence and critical thinking skills can be enhanced with relatively simple, low-cost field-based practicals that scaffold subsequent modules, including Research Training for Geographers and Environmental Science and the dissertation, and that focus on hypothesis testing in addition to knowledge acquisition. Each student spent two hours in the field on campus and two hours processing their data, with further time on the coursework write-up. This seems a reasonable investment in time given the benefits in confidence, skills and knowledge. Embedding such practicals should not replace the larger skills-based modules, such as Research Training, nor should such practical classes entirely replace those that focus more on knowledge acquisition, but these practical classes, where students explore their own ideas, appear to be a useful means to boost student confidence and critical thinking skills at an early stage. The practical was also an excellent means of encouraging peer-to-peer interaction and learning, and this and similar practical classes have good potential for the integration of home and NUIST students.

Follow up

We plan to embed similar practical classes in Part 1 modules to build confidence at the outset of the degree programme and, at Part 3, to further enable integration of home and NUIST students.

Links and References

Kinder A. 2016. Geography: The future of fieldwork in schools. Online: http://www.sec-ed.co.uk/best-practice/geography-the-future-of-fieldwork-in-schools/ (Last accessed: 03 Jan 2020).

Lambert D and Reiss MJ. 2014. The place of fieldwork in geography and science qualifications. Institute of Education, University of London. ISBN: 978-1-78277-095-4. pp. 20.

The impact of COVID upon practical classes in Part 1 chemistry – an opportunity to redevelop a core module

Philippa Cranwell p.b.cranwell@reading.ac.uk, Jenny Eyley, Jessica Gusthart, Kevin Lovelock and Michael Piperakis

Overview

This article outlines a re-design that was undertaken for the Part 1 autumn/spring chemistry module, CH1PRA, which services approximately 45 students per year. All students complete practical work over 20 weeks of the year. There are four blocks of five weeks of practical work in rotation (introductory, inorganic, organic and physical) and students spend one afternoon (4 hours) in the laboratory per week. The re-design was partly due to COVID, as we were forced to look critically at the experiments the students completed, to ensure that the practical skills students developed during the pandemic were relevant for Part 2 and beyond, and to ensure that the assessments students completed could also be stand-alone exercises if COVID prevented the completion of practical work. COVID actually provided us with an opportunity to re-invigorate the course and critically appraise whether the skills that students were developing, and how they were assessed, were still relevant for employers and later study.

Objectives

  • Redesign CH1PRA so it was COVID-safe and fulfilled strict accreditation criteria
  • Redesign the experiments so as many students as possible could complete practical work, by converting some experiments for completion on the open bench to maximise laboratory capacity
  • Redesign assessments so that students who missed sessions due to COVID could still gain credit
  • Minimise the assessment load on academic staff and students
  • Move to a more skills-based assessment paradigm, away from the traditional laboratory report.

Context

As mentioned earlier, the COVID pandemic led to significant difficulties in the provision of a practical class due to restrictions on the number of students allowed within the laboratory: 12 students in the fumehoods and 12 students on the open bench (rather than up to 74 students all using fumehoods previously). Prior to the redesign, each student completed four or five assessments per 5-week block, and all of the assessments related to a laboratory-based experiment. In addition, the majority of the assessments required students to complete a pro-forma or a technical report. We noticed that the pro-formas did not encourage students to engage with the experiments as we intended, and so execution of the experiment was passive. The technical reports placed a significant marking burden upon the academic staff, and each rotation had different requirements for the content of the report, leading to confusion and frustration among the students. The reliance of the assessments upon completion of a practical experiment was also deemed high-risk with the advent of COVID, therefore we had to re-think our assessment and practical experiment regime.

Implementation

In most cases, the COVID-safe bench experiments were adapted from existing procedures, allowing processing of 24 students per week (12 on the bench and 12 in the fumehood), with students completing two practical sessions every five weeks. This meant that technical staff did not have to familiarise themselves with new experimental procedures while implementing COVID guidelines. In addition, three online exercises per rotation were developed, requiring the same amount of time to complete as the practical class, thereby fulfilling our accreditation requirements. The majority of assessments were linked to the ‘online practicals’, with opportunities for feedback during online drop-in sessions. This meant that if a student had to self-isolate they could still complete the assessments within the deadline, reducing the likelihood of ECF submissions and ensuring all Learning Outcomes would still be met. To reduce the assessment burden on staff and students, each 5-week block had three assessment points, and where possible one of these assessments was marked automatically, e.g. using a Blackboard quiz. The assessments themselves were designed to be more skills-based, developing the softer skills students would require upon employment or during a placement. To encourage active learning, reflection was embedded into the assessment regime; it was hoped that by critically appraising their performance in the laboratory, students would better remember the skills and techniques they had learnt, rather than fall into the “see, do, forget” mentality that is prevalent within practical classes.

Examples of assessments include: undertaking data analysis, focussing on clear presentation of data; critical self-reflection of the skills developed during a practical class i.e. “what went well”, “what didn’t go so well”, “what would I do differently?”; critically engaging with a published scientific procedure; and giving a three-minute presentation about a practical scientific technique commonly-encountered in the laboratory.

Impact

Mid-module evaluation was completed using an online form, providing some useful feedback that will be used to improve the student experience next term. The majority of students agreed, or strongly agreed, that staff were friendly and approachable, face-to-face practicals were useful and enjoyable, the course was well-run and the supporting materials were useful. This was heartening to read, as it meant that the adjustments that we had to make to the delivery of laboratory-based practicals did not have a negative impact upon the students’ experience, and that the re-design was, for the most part, working well. Staff enjoyed marking the varied assessments, and the workload was significantly reduced by using Blackboard functionality.

Reflections

To claim that all is perfect with this redesign would be disingenuous, and there was a slight disconnect between what we expected students to achieve from the online practicals and what students were achieving. A number of the students polled disliked the online practical work, with the main reason being that the assessment requirements were unclear. We have addressed this by providing additional videos explicitly outlining expectations for the assessments, and by ensuring that all students are aware of the drop-in sessions. In addition, we amended the assessments so they are aligned more closely with the face-to-face practical sessions, giving students the opportunity for informal feedback during the practical class.

In summary, we are happy that the assessments are now more varied and provide students with the skills they will need throughout their degree and upon graduation. In addition, the assessment burden on staff and students has been reduced. Looking forward, we will now consider the experiments themselves and in 2021/22 we will extend the number of hours of practical work that Part 1 students complete and further embed our skill-based approach into the programme.

Misconceptions About Flipped Learning

During the COVID-19 pandemic, colleagues at UoR were called upon to adjust their courses almost overnight from face-to-face teaching to fully online delivery. As the immediate future is still full of uncertainty, the UoR (2020) teaching and learning framework asks us to be creative in our pedagogical approaches and to come up with strategies that make courses stimulating and engaging. Flipped learning is one of the approaches suggested in the framework. With that in mind, I have written two articles about flipped learning, published here and here.

Flipped learning is a pedagogical approach that is particularly timely during Covid-19. The advancement of internet technology, online learning platforms and social media, combined with growing exposure to the flipped learning approach, has promoted its adoption during this pandemic. However, despite its popularity and the published literature about it, there are evidently many misconceptions about flipped learning, which remains a somewhat poorly-understood concept among many.

In this last article, I thought I would share with you some of the misconceptions about flipped learning that resonate most with me. At the same time, let us reflect on them and see how we can overcome them where possible. Your feedback is always welcome; please do send me your thoughts via w.tew@henley.ac.uk

 

Misconception 1: Flipped learning is about putting video content online

Reflection: This may be the most popular format for flipped learning, but it is NOT simply about putting videos online and having students do homework in class (or online during this pandemic). Referring to the UoR (2020) Teaching and Learning: Framework for Autumn term 2020, we are encouraged to prepare our teaching and lectures in a video format. This format works well with the flipped learning instructional strategy for delivering teaching content, but flipped learning can be about much more than that. Colleagues can opt for videos or just text (reading) materials when they flip their lessons. For example, we can make good use of the Blackboard platform to include online reading materials via Talis Aspire, journal articles, case studies, and news items that are relevant for our students. In other words, flipped learning does not necessarily rely on videos at all.

 

Misconception 2: You need to be in the video

Reflection: This is not necessarily the case, especially as so many of us are shy and ‘unnatural’ in front of the camera, just as I feel myself. This is why the voice-recorded PowerPoint format can be a ‘lifesaver’ for many of us. Having said that, appearing in the video adds a personal touch to the learning materials for students. For example, wearing different hats when you film your videos makes them more interesting and ‘draws’ students’ attention to your content and lessons. Try it; you may even earn a “Mad Hatter” title from your students. Just one of my crazy ideas.

 

Misconception 3: You need to flip your entire module 

Reflection: Many of us assume that we need to flip our entire module for the entire academic year. Not necessarily so! The whole idea of flipped learning is to foster student-centred learning, in which teaching can be personalised to suit students’ needs and learning pace. Therefore, you can flip just one concept or topic, an entire term, or just some weeks. Remember, the focus is on the students’ learning needs – a one-size-fits-all approach definitely does not fit in a flipped learning environment.

 

Misconception 4: Flipped learning is a fad and people have been doing it for years

Reflection: This was my initial thought when I first came to know about flipped learning. A fad is defined as “a style, activity, or interest that is very popular for a short period of time”, an innovation that never takes hold. Flipped learning is anything but this. The fact that it is still actively studied and researched today shows that it is not just a fad. Talbert (2017) argued that flipped learning is not just a rebranding of old techniques. Flipped learning has its own pedagogical framework and value in its effects on learning. In brief, the definition of flipped learning (see Flipped Learning Network, 2014) differentiates it from earlier approaches.

 

Misconception 5: Flipping the classroom takes too much time

Reflection: To be honest, I do think this is true. Preparing for flipped learning and flipping lessons involve a lot of energy and time. Based on my own experience, I can personally testify that it can take a significant amount of time. This also depends on how tech-savvy the teacher is and how much of the teaching content needs to be flipped. However, the fruit of the hard labour is that, once designed, the materials will save time. Ironic, isn’t it? That has been my experience. What I am trying to show is that once it is done, you will be able to use the same content over and over again, year after year. Any updating and changes to the content will then not take as much time as creating everything from scratch.

Finally, I hope you have enjoyed my series on flipped learning published on this platform. I sincerely urge you to consider the flipped learning pedagogical approach during this pandemic, and please do not hesitate to get in touch to continue this conversation.

References

Flipped Learning Network (FLN). (2014) The Four Pillars of F-L-I-P™ , Reproducible PDF can be found at www.flippedlearning.org/definition.

Talbert, R (2017) Flipped Learning: A Guide for Higher Education Faculty. Stylus Publishing, LLC

UoR (2020) Teaching and Learning: Framework for Autumn term 2020, available at: https://www.reading.ac.uk/web/files/leadershipgroup/autumn-teaching-proposal-v11.pdf

 

Introducing group assessment to improve constructive alignment: impact on teacher and student

Daniela Standen, School Director of Teaching and Learning, ISLI, and Alison Nicholson, Honorary Fellow, UoR

Overview

In summer 2018-19, Italian and French in the Institution-wide Language Programme piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, challenges and outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into their assessment mix. Group assessments constitute around 4% of the University's assessment types (EMA data, academic year 2019-20).

Objectives

  • improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
  • enable students to be more relaxed and produce more authentic and spontaneous language
  • make the assessment process more efficient, with the aim of reducing teacher workload

Context

IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.

The learning outcomes of the modules refer to the ability to communicate in the language.  The teaching methodology employed favours student–student interaction and collaboration.  In class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.

The exam was often the first time students would have spoken one-to-one with the teacher. The change in interaction pattern could be intimidating and tended to produce stilted Q&A sessions or interrogations, not communication.

Implementation

Who was affected by the change?

221 Students

8 Teachers

7 Modules

4 Proficiency Levels

2 Languages

What changed?

  • The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
  • The marking criteria changed, so that quality of interaction was better defined and carried higher weight
  • The marking process changed: teachers as well as students were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed. One teacher concentrated on running the exam and marking using holistic marking criteria, while the second teacher listened and marked using analytic rating scales

Expected outcomes

  • Student-to-student interaction creates a more relaxed atmosphere, in which students produce more authentic and spontaneous language
  • Students take longer speaking turns
  • Students use more features of interaction

(Hardi Prasetyo, 2018)

  • Perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how the competence of the person I am speaking to affects my outcomes (Galaczi & French, 2011)

Mitigation

  • Homogeneous pairings, through class observation
  • Include monologic and dialogic assessment tasks
  • Planned teacher intervention
  • Inclusion of communicative and linguistic marking criteria
  • Pairing teachers as well as students, for more robust moderation

Impact

Methods of evaluation

Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses coming from students of Italian. Responses were consistent across the two languages.

Eight teachers provided verbal or written feedback.

Students’ Questionnaire Results

Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.

However, they did not feel that the new format improved their opportunity to demonstrate their learning, nor that speaking to another student was more relaxing. The qualitative feedback tells us that this was due to anxieties around pairings.

Teachers’ Feedback

  • Language production was more spontaneous and authentic. One teacher commented ‘it was a much more authentic situation and students really helped each other to communicate’
  • Marking changed from a focus on listening for errors towards rewarding successful communication
  • Workload decreased by up to 30% for the average student cohort, and peaks and troughs of work were better distributed

Reflections

Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on the students’ ability to demonstrate that they could communicate in the language.

However, this was not fully reflected in the students’ feedback. There is a clear discrepancy between teachers’ and students’ perceptions of how well the new format allows students to showcase their learning.

Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.

Follow-up

Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.

References

Galaczi, E. & French, A. (2011). In Taylor, L. (ed.), Examining Speaking: Research and Practice in Assessing Second Language Speaking. Cambridge: Cambridge University Press.

Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.

Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.

Race, P. (2014). Making Learning Happen (3rd ed.). Los Angeles; London: Sage.

Race, P. (2015). The Lecturer’s Toolkit: A Practical Guide to Assessment, Learning and Teaching (4th ed.). London; New York: Routledge.

 

How ISLI moved to full online teaching in four weeks

Daniela Standen, ISLI

Overview

ISLI teaches almost exclusively international students. Many of our programmes run all year round, so ISLI had to move to teach exclusively online in the Summer Term. This case study outlines the approach taken and some of the lessons learnt along the way. 

Objectives 

  • Delivering a full Pre-sessional English Programme online to 100 students.
  • Providing academic language and literacy courses for international students.
  • Teaching International Foundation students, with one cohort about to begin their second term at Reading.
  • Teaching students on the Study Abroad Programme.

Context  

In April 2020, as the country went into lockdown and most of the University had finished teaching, ISLI was about to start a ‘normal’ teaching term. The Pre-sessional English Programme was about to welcome 100 (mostly new) students to the University, the January entry of the International Foundation Programme was less than half-way through their studies, and the Academic English Programme was still providing language and academic literacy support to international students.

Implementation

Moving to online teaching was greatly facilitated by having in-house TEL expertise, as well as colleagues with experience of online teaching, who supported the upskilling of ISLI academic staff and were able to advise on programme, module and lesson frameworks.

We thought that collaboration would be key, so we put in place numerous channels for cross-School working to share best practice and tackle challenges. ISLI TEL colleagues offered weekly all-School Q&A sessions as well as specific TEL training. We set up a Programme Directors’ Community of Practice that meets weekly, and made full use of TEAMS as a space where resources and expertise could be shared. Some programmes also created a ‘buddy system’ for teachers.

The School primarily adopted an asynchronous approach to teaching, since synchronous delivery was made particularly difficult by having students scattered across the globe. We used a variety of tools, from videos, screencasts, narrated PowerPoints and Task & Answer documents to full Xerte lessons, generally combining several of these to build a lesson. Interactive elements were initially provided mostly asynchronously, using discussion boards, Padlet and Flipgrid. However, as the term progressed, feedback from students highlighted a need for some synchronous delivery, which was carried out using Blackboard Collaborate and TEAMS.

Impact

It has not been easy, but there have been many positive outcomes from having had to change our working practices. Despite the incredibly short timescales and almost non-existent preparation time, our PSE 3 students started and successfully finished their programme completely online, the IFP January-entry students are ready to start their revision weeks before sitting their exams in July, and international students writing dissertations and postgraduate research were supported throughout the term.

As a School we have learnt new skills and to work in ways that we may not have thought possible had we not been forced into them.  These new ways of working have fostered cross-School collaboration and sharing of expertise and knowledge.

Reflections

We have learnt a lot in the past three months. On average, it takes a day’s work to transform one hour of face-to-face teaching into a task-based online lesson.

Not all TEL tools are equally effective and efficient; below are some of our favourites:

  • For delivering content: Narrated PowerPoints, Screen casts, Webinars, Task and Answer (PDF/Word Documents)
  • For building online communities: live sessions on BB Collaborate (but students are sometimes shy to take part in breakout group discussions), Flipgrid, discussion boards.
  • For student engagement: BB retention centre, Tutorials on Teams, small frequent formative assignments/tasks on Blackboard Assignments.
  • For assessment: BB Assignments, Turnitin, Teams for oral assessment

If time were not a consideration Xerte would also be on the list.

Copyright issues can have a real impact on what you can do when delivering completely online. Careful consideration also needs to be given when linking to videos, particularly if you have students based in China.

Follow up

ISLI is now preparing for Summer PSE, which starts at the end of June. Many of the lessons learnt this term have fed into preparation for summer and autumn teaching.  In particular, we have listened to our students, who told us clearly that face-to-face interaction even if ‘virtual’ is really important and have included more webinars and Blackboard Collaborate sessions in our programmes.

Links

https://www.reading.ac.uk/ISLI/  

Taking Academic Language and Literacy Courses Online

Dr Karin Whiteside, ISLI

Overview

Alongside its embedded discipline-specific provision, the Academic English Programme (AEP) offers a range of open sign-up academic language and literacy courses each term. This case study outlines the process of rapidly converting the summer term provision to online delivery, and reports student feedback and reflections on the experience, which will help inform continued online delivery this autumn term.

Objectives

Our aim was to provide academic language and literacy support which, as far as practicably possible, was equivalent in scope and quality to our normal face-to-face offering for the same time of year. In summer term, our provision is particularly important for master’s students working on their dissertations, with high numbers applying for Dissertation & Thesis Writing; courses such as Core Writing Skills and Academic Grammar also provide important ‘building block’ input needed for competent research writing.

Context

Prior to the COVID crisis, our face-to-face courses on different aspects of written and spoken Academic English had been offered for open application on a first-come, first-served basis, with a rolling weekly waiting list. With a maximum of 20 students per class, we were able to offer interactive, task-based learning involving analysis of target language and communicative situations in context, practice exercises, and opportunities for discussion and feedback within a friendly small-group environment.

Implementation

Within an extremely tight turnaround time of four weeks, we determined a slightly slimmed-down programme of five ‘open-to-all’ online courses – Academic Grammar, Core Academic Writing Skills, Dissertation & Thesis Writing, Essays: Criticality, Argument, Structure and Listening & Note-taking – and replaced our normal application process with self-enrolment via Blackboard, meaning uncapped numbers could sign up and have access to lessons.

Time constraints meant we had to be pragmatic in terms of where to focus our energies. Conversion of course content to an online format needed to be done in a way that was both effective and sustainable, given the potential continued need for online AEP provision going into 2020/21. We predicted (rightly!) that the process of initially converting small-group interactive learning materials to an online format in which their inductive, task-based qualities were retained would be labour-intensive and time-consuming. Therefore, for the short term (summer 2020) we adopted a primarily asynchronous approach, with a view to increasing the proportion of synchronous interactivity in future iterations once content was in place.

In converting face-to-face lessons to online delivery, we found what often worked most effectively was to break down the contents of a two-hour face-to-face lesson into 2-3 task-focused online parts, each introduced and concluded with short, narrated PowerPoints/MP4 videos. We determined a weekly release date for lesson materials on each course, often accompanied by a ‘flipped’ element, labelled ‘Pre-lesson Task’, released a few days prior to the main lesson materials. We set up accompanying weekly Discussion Forums where students could ask questions or make comments, for which there was one ‘live’ hour per week. Apart from Pre-lesson Tasks, task answers were always made available at the same time as lessons, to allow students complete autonomy.

Moving rapidly to online delivery meant not necessarily having the highest specification e-learning tools immediately to hand but instead working creatively to get the best out of existing technologies, including the Blackboard platform, which prior to this term had had a mainly ‘depository’ function in AEP. To ensure ease of navigation, the various attachments involved in creating such lessons needed to be carefully curated by Folder and Item within BB Learning Materials. Key to this was clear naming and sequencing, with accompanying instructions at Folder and Item level.

Impact, Reflections and Follow-up

Positive outcomes of taking the summer AEP provision online have included noticeably higher uptake (e.g. in Academic Grammar, 92 self-enrolments compared to 30 applications in summer term 2018/19) and noticeably higher real engagement (e.g. with an average of 11 students attending the 2018/19 summer face-to-face Academic Grammar class, compared to a high of 57 and average of 38 students accessing each online lesson). Running the courses asynchronously online has meant no waiting lists, allowing access to course content to all students who register interest. It also means that students can continue to join courses and work through materials over the summer vacation period, which is particularly useful for international master’s students working on Dissertations for September submission, and for cohorts overseas such as the IoE master’s students in Guangdong.

In survey responses gathered thus far, response to course content has been largely positive: “It provided me an insight into what is expected structure and criticality. Now that I am writing my essay, I could see the difference”. Students appreciated teacher narration, noticing if it was absent: “I would prefer our teacher to talk and explain the subject in every slide.” The clarity of lesson presentation within Blackboard was also noted: “I think the most impressive part in this course is the way these lessons were arranged in BB as every lessons were explicitly highlighted, divided into parts with relevant tasks and their answers. Thus, I could effectively learn the content consciously and unconsciously.”

There was a range of reactions to our approach to online delivery and to online learning more generally. 52% of students were happy with entirely asynchronous learning, while 48% would have preferred a larger element of real-time interactivity: “Although this lessons ensured the freedom in dealing with the material whenever it was possible, the lack of a live-scheduled contact with the teacher and other students was somewhat dispersive.”; “I prefer face to face in the classroom because it encourages me more to contribute”. In normal circumstances, 34% of students said they would want entirely face-to-face AEP classes, whilst 21% would want a blended provision and 45% would prefer learning to remain entirely online, with positive feedback regarding the flexibility of the online provision: “it’s flexible for students to do it depending on their own time.”; “Don’t change the possibility to work asynchronously. It makes it possible to follow despite being a part time student.” Going forward, we plan to design in regular synchronous elements, in the form of webinars linked to the asynchronous spine of each course, to respond to students’ requests for more live interactivity. We also plan to revisit and refine our use of Discussion Forums in Blackboard. Whilst engagement with lesson content was high, students made limited use of Q&A Forums. It is hoped that more targeted forums, directly linked to flipped tasks, will encourage greater engagement with this strand of the online delivery in the future.

Links

The AEP website ‘Courses, Workshops and Webinars’ page, which gives details of this summer term’s courses and what will be on offer in autumn: http://www.reading.ac.uk/ISLI/enhancing-studies/academic-english-programme/isli-aep-courses.aspx

Developing Diversity and Inclusion teaching: The importance of D&I and Ethical Practice

Dr Allán Laville, Psychology and Clinical Language Sciences, a.laville@reading.ac.uk

Overview

In the training of Psychological Wellbeing Practitioners (PWPs), teaching must include a focus on Diversity and Inclusion (D&I) as well as how this relates to ethical practice. Therefore, I created a 15-minute screencast that tied key D&I principles to clinical practice, with a particular focus on ethical practice within this area.

Objectives

  1. To support students in being aware of key D&I and ethical principles and how these principles relate to their clinical practice.
  2. To support students in writing a 500-word reflective piece on the importance of considering D&I in their ethically-sound, clinical practice.

Context

PWP programmes include D&I training within the final module of the clinical programme, but to meet the British Psychological Society (BPS) programme standards, D&I training needs to be incorporated throughout. Furthermore, this training should be tied to the BPS programme standard on Ethical Practice teaching (Module PY3EAA1/PYMEAA).

Implementation

The first step was to identify the key sources to include within the screencast. These were wide-ranging: legislation (Equality Act 2010), positive practice guides (Improving Access to Psychological Therapies), ethical practice guidelines (British Psychological Society), and the University’s Fitness to Practise policy.

The second step was to think about how students could engage with the screencast in a meaningful way. Based on an earlier T&L Exchange project report of mine (https://sites.reading.ac.uk/t-and-l-exchange/2019/07/23/developing-innovative-teaching-the-importance-of-reflective-practice/), I wanted to include an element of reflective practice. Students were asked to write a 500-word reflective piece on their own take-home points from the screencast, preferably following the Rolfe, Freshwater and Jasper (2001) reflective model: a) ‘what’, i.e. what is being considered; b) ‘so what’, which I tell my students is the ‘why care?’ part; and c) ‘now what’, i.e. drawing on ‘what’ and ‘so what’ to detail a SMART action plan for future clinical practice.

Example by Will Warley, Part 3 MSci Applied Psychology (Clinical) student.

Impact

The student feedback about the screencast and completing the reflective piece has been very positive. This has been across both the MSci in Applied Psychology (Clinical) as well as the Charlie Waller Institute (CWI), PG (Cert) in Evidence-Based Psychological Treatments (IAPT Pathway). The training materials have also been shared with members of the SPCLS Board of Studies for CWI training programmes.

In regard to national-level impact, I have presented this innovative approach to D&I teaching at the BPS Programme Liaison Day, which included the BPS PWP Training Committee and Programme Directors from across the UK. The presentation was very well received, including requests to disseminate the materials that we use in teaching at UoR. These materials have now been circulated to all PWP training providers in the UK to inform their D&I provision.

Reflections

One core reason for the success of this activity was the commitment and creativity of our students! Some students used software to create excellent mind maps, interactive presentations or a YouTube video! There was even an Instagram account used to illustrate the main take-home points from the screencast, which I thought was particularly innovative. Overall, I was absolutely delighted to see such high levels of student engagement with topics that are so important – both personally and professionally.

In regard to better implementation, it is possible that slightly more guidance could have been provided regarding how to approach the reflective task, but the brief of ‘be as creative as possible!’ worked very well indeed!

Follow up

I will be following up with the BPS PWP Training Committee in 2020 to see how this activity has developed within other PWP training providers! We will then create a summary of all innovative approaches to including D&I in PWP programmes and how these meet the programme standards.

Links

https://my.cumbria.ac.uk/media/MyCumbria/Documents/ReflectiveModelRolfe.pdf

Student YouTube video as submission on reflective task: https://youtu.be/hMU6F_dknP4

Using Flipped Learning to Meet the Challenges of Large Group Lectures

Amanda Millmore / School of Law / a.millmore@reading.ac.uk

Overview

Faced with double-teaching a cohort of 480 students (plus an additional 30 in University of Reading Malaysia), I was concerned to ensure that students in each lecture group had a similar teaching experience. My solution was to “flip” some of the learning, by recording short video lectures covering content that I would otherwise have lectured live, and to use the time freed up to slow the pace and instigate active learning within the lectures. Students provided overwhelmingly positive feedback in formal and informal module evaluations. The introduction of flipped learning has aided student welfare, allowing those who are absent, have disabilities, or face language barriers to revisit material as and when needed. For staff, it has reduced my workload and has the ongoing benefit of reducing the workload of colleagues who have taken over teaching the module.

Objectives

  • Record short video lectures to supplement live lectures.
  • Use the time freed up by the removal of content no longer delivered live to introduce active learning techniques within the lectures.
  • Support the students in their problem-solving skills (tested in the end of year examination).

Context

The module “General Introduction to Law” is a “lecture only” first year undergraduate module, which is mandatory for many non-law students, covering unfamiliar legal concepts. Whilst I have previously tried to introduce some active learning into these lectures, I have struggled with time constraints due to the sheer volume of compulsory material to be covered.

Student feedback requested more support in tackling legal problem questions. I wanted to assist students, but needed to free up some space within the lectures to do so; “flipping” some of the content by creating videos seemed to offer a solution.

As many academics (Berrett, 2012; Schaffzin, 2016) have noted, there is more to flipping than merely moving lectures online; it is about a change of pedagogical approach.

Implementation

I sought initial support from the TEL (Technology Enhanced Learning) team, who were very happy to give advice about technology options. I selected the free Screencast-O-Matic software, which was simple to use with minimal equipment (a headset with microphone plugged into my computer).

I recorded 8 short videos, which were screencasts of some of my lecture slides with my narration; 6 covered traditional lecture content and 2 offered problem-solving advice, modelling an exemplar problem question and answer (which I had previously offered as straightforward read-only documents on Blackboard).

The software that I used restricted me to 15-minute videos, which worked well for maintaining student attention. My screencast videos were embedded within the Blackboard module and could also be viewed directly on the internet: https://screencast-o-matic.com/u/iIMC/AmandaMillmoreGeneralIntroductiontoLaw.

I reminded students to watch the videos via email and during the lectures, and I was able to track the number of views of each video, which enabled me to prompt students if levels of viewing were lower than I expected.

By moving some of the content delivery online I was also able to incorporate more problem-solving tasks into the live lectures. I was able to slow the pace and to invite dialogue, often by using technology enhanced learning. For example, I devoted an hour to tackling an exam-style problem, with students actively working to solve the problem using the knowledge gained via the flipped learning videos and previous live lectures. I used the applications Mentimeter, Socrative and Kahoot to interact with the students, asking them multiple-choice questions, encouraging them to vote on questions and to create word clouds of their initial thoughts on tackling problem questions as we progressed.

Evaluation

I evaluated reaction to the module using the usual formal and informal module evaluations. I also tracked engagement with the videos and actively used these figures to prompt students if views were lower than expected. I monitored lecture attendance and didn’t notice any drop-off. Finally, I reviewed end-of-year results to assess the impact on students’ results.

Impact

Student feedback, about the videos and problem solving, was overwhelmingly positive in both formal and informal module evaluations.

Videos can be of assistance if a student is absent, has a disability or wishes to revisit the material. Sankoff (2014) and Billings-Gagliardi and Mazor (2007) dismiss concerns about reduced student attendance due to online material, and this was borne out by my experience, with no noticeable drop-off in numbers attending lectures; I interpret this as a positive sign of student satisfaction. The videos worked to supplement the live lectures rather than replace them.

There is a clear, positive impact on my own workload and that of my colleagues. Whilst I am no longer teaching on this module, my successor has been able to use my videos again in her teaching, thereby reducing her own workload. I have also been able to re-use some of the videos in other modules.

Reflections

Whilst flipped learning is intensive to plan, create and execute, the ability to re-use the videos in multiple modules is a huge advantage; short videos are simple to re-record if, and when, updating is required.

My initial concern that students would not watch the videos was utterly misplaced. Each video has had in excess of 1,200 views (and one video has exceeded 2,500). Some of the material was covered only by the flipped learning videos, yet still appeared within the examination; students who tackled those questions did as well as those answering questions on content delivered via live lecture, although those questions were less popular (2017/18 examination).

I was conscious that there might be some students who would simply ignore the videos, thereby missing out on chunks of the syllabus. I tried to mitigate this by running quizzes during lectures on the recorded material, and by offering banks of multiple-choice questions (MCQs) on Blackboard for students to test their knowledge (aligned to the summative examination, which included a multiple-choice section). In addition, I clearly signposted the importance of the video-recorded material by email, on the Blackboard page and orally, and emphasised that it would form part of the final examination and could not be ignored.

My experience echoes that of Schaffzin’s study (2016), which found no statistically significant difference in law results after instituting flipped learning, although she felt that it was a more positive teaching method. Examination results for the module in the end-of-year summative assessment (100% examination) were broadly consistent with results in previous academic years, but student satisfaction was higher, with positive feedback about the use of videos and active learning activities.

Follow Up

Since creating the flipped learning videos, another colleague has taken over as convenor and continued to use the videos I created. Some of the videos have also been reused in other modules. I have used screencast videos in another non-law module, and also used them as introductory material for a large core Part 1 Law module. Student feedback in module evaluations praised the additional material. One evolution in another module was that, when I ran out of time to cover working through a past exam question within a lecture, I created a quick screencast which finished off the topic for students; I felt that it was better to go at a more sensible pace in the lecture and use the screencast rather than rush through the material.

Michelle Johnson, Module Convenor 2018-2019 commented that:

“I have continued to use and expand the flipped learning initiative as part of the module and have incorporated further screencasts into the module in relation to the contract law content delivered. This allowed for additional time on the module to conduct a peer-assessment exercise focussed on increasing the students’ direct familiarity with exam questions and also crucially the marking criteria that would be used to score their Summer exams. Students continue to be very positive about the incorporation of flipped learning material on the module and I feel strongly that it allowed the students to review the more basic introductory content prior to lectures, this allowing time for a deeper engagement with the more challenging aspects of the lectures during lecture time. This seemed to improve students understanding of the topics more broadly, allowing them to revisit material whenever they needed and in a more targeted way than a simple lecture recording.”

TEF

TQ1, LE1, SO3

Links

University of Reading TEL advice about personal capture – https://sites.reading.ac.uk/tel-support/category/learning-capture/personal-capture

Berrett, D. (2012). How “Flipping” the Classroom Can Improve the Traditional Lecture. Chronicle of Higher Education. https://www.chronicle.com/article/how-flipping-the-classroom/130857

Billings-Gagliardi, S and Mazor, K. (2007) Student decisions about lecture attendance: do electronic course materials matter?. Academic Medicine: Journal of the Association of American Medical Colleges, 82(10), S73-S76.

Sankoff, P. (2014) Taking the Instruction of Law outside the Lecture Hall: How the Flipped Classroom Can Make Learning More Productive and Enjoyable (for Professors and Students), 51, Alberta Law Review, pp.891-906.

Schaffzin, K. (2016) Learning Outcomes in a Flipped Classroom: A comparison of Civil Procedure II Test Scores between Students in a Traditional Class and a Flipped Class, University of Memphis Law Review, 46, pp. 661.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic. Click here for the first entry, which outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students associate with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover students’ views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but this time we gave students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A Survey Monkey was circulated to all DEL students with the same series of questions as those used for the focus group in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The Survey Monkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% in the focus group preferred to keep to the traditional exam + assessed essay method, the survey found the aversion to exams to be more prominent. 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We found it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second highest scorer (presumably students identified a connection between the two).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy are not entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, modules retaining an exam would be left in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but diversification could leave modules that retain traditional assessment patterns exposed to students deselecting them. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the first link which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module specific attainment: the assessment conventions in DEL, particularly at Part 2, have a standardised assessment format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments that are best suited to the attainment of them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always based on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One focus group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.