‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Rita Balestrini and Elisabeth Koenigshofer, School of Literature and Languages, r.balestrini@reading.ac.uk; e.koenigshofer@reading.ac.uk

Overview

Working in collaboration with two Final Year students, we designed two ‘flexible’, ‘minimalist’ rubric templates, usable and adaptable across different languages and levels, to provide a basis for the creation of level-specific, and potentially task-specific, marking schemes where sub-dimensions can be added to the main dimensions. The two marking templates are being piloted this year in the DLC. The project will feature in this year’s TEF submission.

Objectives

Design, in partnership with two students, rubric templates for the evaluation and feedback of writing tasks and oral presentations in foreign languages which:

  • were adaptable across languages and levels of proficiency
  • provided a more inclusive and engaging form of feedback
  • responded to the analysis of student focus group discussions carried out for a previous TLDF-funded project

Context

As a follow-up to a teacher-learner collaborative appraisal of rubrics used in MLES, now DLC, we designed two marking templates in partnership with two Final Year students, who had participated in the focus groups from a previous project and were employed through Campus Jobs. ‘Acknowledgement of effort’, ‘encouragement’, ‘use of non-evaluative language’, ‘need for and, at the same time, distrust of, objective marking’ were recurrent themes that had emerged from the analysis of the focus group discussions and clearly appeared to cause anxiety for students.

Implementation

We organised a preliminary session to discuss these findings with the two student partners. We suggested some articles about ‘complexity theory’ as applied to second language learning (Kramsch, 2012; Larsen-Freeman, 2012; 2015a; 2015b; 2017), with the aim of making our theoretical perspective explicit and transparent to them. A second meeting was devoted to planning collaboratively the structure of two marking schemes for writing and presentations. The two students then worked independently to produce examples of standard descriptors which avoided evaluative language and emphasised achievement rather than shortcomings. At a third meeting, they presented and discussed their proposals with us. In the final meetings, we continued working together to finalise the templates and the two visual learning charts the students had suggested. Finally, the two students wrote a blog post recounting their experience of this collaborative work.

The two students appreciated our theoretical approach, felt that it was in tune with their own point of view, and believed it could support the enhancement of the assessment and marking process. They also found resources on their own, which they shared with us, including rubrics from other universities. They made valuable suggestions, gave us feedback on our ideas and helped us to find alternative terms when we were struggling to avoid evaluative language in our descriptors. They also suggested using visual elements in the marking and feedback schemes to increase immediacy and effectiveness.

Impact

The two marking templates are being piloted this year in the DLC. They were presented to colleagues over four sessions during which the ideas behind their design were explained and discussed. Further internal meetings are planned. These conversations, already begun with the previous TLDF-funded project on assessment and feedback, are contributing to the development of a shared discourse on assessment, which is informed by research and scholarship. The two templates have been designed in partnership with students to ensure accessibility and engagement with the assessment and feedback process. This is regarded as an outstanding practice in the ‘Assessment and feedback benchmarking tool’ produced by the National Union of Students and is likely to feature positively in this year’s TEF submission.

Reflections

Rubrics have become mainstream, especially within certain university subjects such as foreign languages. They have been introduced to ensure accountability and transparency in marking practices, but they have also created new problems of their own by promoting a false sense of objectivity in marking and grading. The openness and unpredictability of complex performance in foreign languages, and of the dynamic language learning process itself, are not adequately reflected in the detailed descriptors of the marking and feedback schemes commonly used for the objective numerical evaluation of performance-based assessment. As emerged from the analysis of focus group discussions conducted in the department in 2017, a lack of understanding of, and engagement with, the feedback provided by this type of rubric can generate frustration in students. Working in partnership with students, rather than simply listening to their voices or seeing them as evaluators of their own experience, helped us to design minimalist and flexible marking templates which use sensible and sensitive language, introduce visual elements to increase immediacy and effectiveness, leave considerable space for assessors to comment on different aspects of an individual performance, and provide ‘feeding forward’ feedback. This type of partnership can be challenging because it requires remaining open to unexpected outcomes. Whether it can bring about real change depends on how its outcomes interact with the educational ecosystems in which they are embedded.

Follow up

The next stage of the project will involve colleagues in the DLC who will be using the two templates. They will contribute to the creation of a ‘bank’ of descriptors by sharing those they develop to tailor the templates to specific stages of language development, language objectives, language tasks, or dimensions of student performance. We also intend to encourage colleagues teaching culture modules to consider using the basic structure of the templates to start designing marking schemes for the assessment of student performance in their modules.

Links

An account written by the two student partners involved in the project can be found here:

Working in partnership with our lecturers to redesign language marking schemes

The first stages of this ongoing project to enhance the process of assessing writing and speaking skills in the Department of Languages and Cultures (DLC, previously MLES) are described in the following blog entries:

National Union of Students 2017. The ‘Assessment and feedback benchmarking tool’ is available at:

http://tsep.org.uk/wp-content/uploads/2017/07/Assessment-and-feedback-benchmarking-tool.pdf

References

Bloxham, S. 2013. Building ‘standard’ frameworks. The role of guidance and feedback in supporting the achievement of learners. In S. Merry et al. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

Bloxham, S. and Boyd, P. 2007. Developing effective assessment in Higher Education. A practical guide. Maidenhead: McGraw-Hill International.

Bloxham, S., Boyd, P. and Orr, S. 2011. Mark my words: the role of assessment criteria in UK higher education grading practices. Studies in Higher Education 36 (6): 655-670.

Bloxham, S., den-Outer, B., Hudson, J. and Price, M. 2016. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 41 (3): 466-481.

Brooks, V. 2012. Marking as judgement. Research Papers in Education 27 (1): 63-80.

Gottlieb, D. and Moroye, C. M. 2016. The perceptive imperative: Connoisseurship and the temptation of rubrics. Journal of Curriculum and Pedagogy 13 (2): 104-120.

HEA 2012. A Marked Improvement. Transforming assessment in HE. York: The Higher Education Academy.

Healey, M., Flint, A. and Harrington K. 2014. Engagement through partnership: students as partners in learning and teaching in higher education. York: The Higher Education Academy.

Kramsch, C. 2012. Why is everyone so excited about complexity theory in applied linguistics? Mélanges 33: 9-24.

Larsen-Freeman, D. 2012. The emancipation of the language learner. Studies in Second Language Learning and Teaching. 2(3): 297-309.

Larsen-Freeman, D. 2015a. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Larsen-Freeman, D. 2015b. Complexity Theory. In VanPatten, B. and Williams, J. (eds.) Theories in Second Language Acquisition: An Introduction. 2nd edition. New York: Routledge: 227-244.

Larsen-Freeman, D. 2017. Just learning. Language Teaching 50 (3): 425-437.

Merry, S., Price, M., Carless, D. and Taras, M. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

O’Donovan, B., Price, M. and Rust, C. 2004. Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education 9 (3): 325-335.

Price, M. 2005. Assessment standards: the role of communities of practice and the scholarship of assessment. Assessment & Evaluation in Higher Education 30 (3): 215-230.

Sadler, D. R. 2009. Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education 34 (2): 159-179.

Sadler, D. R. 2013. The futility of attempting to codify academic achievement standards. Higher Education 67 (3): 273-288.

Torrance, H. 2007. Assessment as learning? How the use of explicit learning objectives, assessment criteria and feedback in post-secondary education and training can come to dominate learning. Assessment in Education 14 (3): 281-294.


Yorke, M. 2011. Summative assessment: dealing with the ‘Measurement Fallacy’. Studies in Higher Education 36 (3): 251-273.

Reframing Identity 360

Kate Allen, Department of Art, k.allen@reading.ac.uk

Overview

An investigative artwork that explores identity using 360 cameras, developed through practical, alumni-led workshops and socially engaged art with current art students, school groups and the general public. It formed part of ‘ArtLab Movement’ at Tate Exchange (TEx) at Tate Modern in March 2019 and will be archived on the ArtLab website.

Objectives

- Contribute to a live art event / outreach work experience led by alumni at Tate Exchange, 1-3 March 2019

- Explore identity capture with 360 cameras

- Experiment with 360 cameras, including designing, capturing, printing and editing.

- Create portraits with purpleSTARS, people with learning disabilities and children from Widening Participation schools in Reading.

Context

Reframing Identity explored self-portraits shot in 360, developed as a response to Tania Bruguera’s Turbine Hall commission concerning institutional power, borders and migration. Can 360 self-portraits raise awareness of how interconnected we are? With no person ever behind the 360 camera, everyone is included.

Implementation

Alumna and virtual reality artist Kassie Headon researched ideas in response to Tania Bruguera’s installation at Tate Modern, inspired by Bruguera’s ideas on inclusion and connecting to Kate Allen’s research with purpleSTARS, a group of people with and without learning disabilities who aim to make museums more inclusive. Kassie demonstrated to students and purpleSTARS how to use the GoPro Fusion camera and the app to edit 360 content. Activities to share the 360 self-portrait concept with visitors were developed, including drawing cylindrical self-portraits which participants could then wear on their heads for a 360 selfie. Students facilitated the Reframing Identity 360 workshop as part of ArtLab Movement at TEx. Using 360 cameras was a new experience and concept for our students and for most people visiting TEx. The 360 self-portraits were exhibited via a live video stream from the 360 cameras to an iPad displayed at the Tate, which let participants explore the views and manipulate and distort them to create the desired effect. Participants’ 360 self-portraits were also printed or sent to their phones.

Impact

Reframing Identity 360 created access to, and inclusion with, new technologies for students and the public. Experiencing the live video stream frequently gave visitors an ‘Oh wow’ moment. TEx provided an opportunity for research-led teaching, with Dr Allen, purpleSTARS, alumna Kassie Headon and current BA students exploring the concept of 360 self-portraits, gaining professional practice experience by facilitating the workshops, and developing technical skills working with the 360 camera. The 360 cameras are now part of the digital equipment available to students, with a core team of ArtLab students now familiar with their potential and how to use them.

Reflections

Working with new technologies in collaboration with alumni, ArtLab students and purpleSTARS led to new perspectives on ideas of inclusion and self-portraiture. The experimental research occurred in response to work at the Tate and in collaboration with visitors to TEx. The project built capacity and awareness of new technology being introduced into the Art Department, where learning through research and practical experience revealed the potential to create artworks and inclusive engagements.

Follow up

Kassie Headon continued to work with the 360 camera, collaborating with Widening Participation schools during the ArtLab summer workshops 2019, exploring spaces and manipulating 2D versions of 3D space.

We are developing further research collaborations and research led teaching opportunities for ideas exploring inclusion in museums and immersive virtual reality artworks/experiences using Oculus Rift technology.

Links and References

We created a 360 recording of our Reframing Identity event at the Tate https://www.thinglink.com/mediacard/1158753748827242499?autoplay=0&autorotate=0&displaytitle=1&rel=1

ArtLab documents the workshop

https://readingartlab.com/2019/04/25/artlab-tate-exchange-visual-diary-2nd-and-3rd-march-2019/

purpleSTARS web documentation

https://purplestars.org.uk/2017/11/12/purplestars-at-tate-gallery-2018/

Tate Exchange webpage

https://www.tate.org.uk/whats-on/tate-modern/tate-exchange/workshop/reading-assembly-movement

Engaging Diverse Learning Communities in Partnership: A Case Study Involving Professional Practice Students in Re-designing an Assessment


Lucy Hart (student – trainee PWP) – l.hart@student.reading.ac.uk

Tamara Wiehe (staff – PWP Clinical Educator) – t.wiehe@reading.ac.uk

Charlie Waller Institute, School of Psychology and Clinical Language Sciences

Overview

This case study re-designed an assessment for two Higher Education programmes where students train to become Psychological Wellbeing Practitioners (PWPs) in the NHS. The use of remote methods engaged harder-to-reach students in the re-design of the assessment tool. The project promotes the effectiveness of partnership working across diverse learning communities by placing student views at the centre of decision making. In line with one of the University’s principles of partnership (2018) – shared responsibility for the process and outcome – this blog has been created by a student involved in the focus group and the member of teaching staff leading the project.

Objectives

  • Improve the design of an assessment across the University’s PWP training programmes.
  • Involve students throughout the re-design process, ensuring student voices and experiences are acknowledged.
  • Implement the new assessment design with the next cohorts.

Context

It was proposed by students in modular feedback and staff in a quarterly meeting that the design of an assessment on the PWP training programmes could be improved. These programmes are grounded in evidence-based, self-reflective and collaborative practice. Therefore, it was appropriate to maintain this style of working throughout the process. This was achieved through the students reflecting on their experiences when generating ideas and reviewing the re-designed assessment.

Implementation

Traditional methods of partnership were not suitable for our students due to the nature of the PWP training programmes. Their week consists of one teaching day running from 9:30-4:30, a study day and three days practising clinically as a trainee PWP in an NHS service. Location was another factor as many of our students commute to University and live closer to their workplace. The use of technology and remote working enabled us to overcome these barriers and work in partnership with our students.

The partnership process followed these three steps:


When generating ideas and reviewing the proposed assessment, we, the professional practice students, considered the following points:

  • Assessment design – consistency in using vignettes throughout the course, so that students will be familiar with this method of working. A word limit ensures concise responses.
  • Time frame – the release date of the essay in proportion to the examination date.
  • Feasibility – will there be enough study days to compensate for the change in design, allowing trainees to plan their essays?
  • Academic support – opportunities within the academic timetable to provide additional supervision-style sessions later in the module to support students.
  • Learning materials – accessibility of resources on Blackboard. Assigning study days to allow planning of the essay.

Impact

  • It was agreed that the original ICT would be replaced with written coursework based on a vignette and implemented with our next cohorts.
  • The assessment aligned with the module learning outcomes and student experiences were considered in a meaningful way.
  • Harder-to-reach students were able to engage in the re-design of the assessment through effective communication methods.

Reflections

Student perspective:

“Being the experts of our own experiences, it was refreshing to have our voices heard. We hope the re-design supports future cohorts and reduces anxieties around managing both university and service-based training. The focus group was a success due to the clear agenda setting and the feasibility of remote online working. A larger focus group would have been beneficial during the review stage to reduce the biases associated with a small sample size.”

Staff perspective:

“Student input allowed us to hear more about their experiences during the training and took a lot of pressure off staff to always be the ones coming up with solutions. The outcomes have a far-reaching impact beyond the students and staff on the programme, in terms of engaging diverse learning communities in Higher Education and forming more connections between universities and NHS services. Although inclusivity and diversity were considered throughout, more participants in the virtual focus group would have improved this further. Students could also have more power over the creation of the assessment materials themselves. Both of these reflections will inform my professional practice going forwards.”

Using personal capture to support students to learn practical theory outside of the laboratory

Dr Geraldine (Jay) Mulley – School of Biological Sciences  

Overview

I produced four screen casts to encourage students to better prepare for practical classes and to reinforce practical theory taught in class. Approximately 45% of the cohort watched at least some of the video content, mainly in the few days leading up to the practical assessment. The students appreciated the extra resources, and there was a noticeable improvement in module satisfaction scores.

Objectives

  • To provide consistency in delivery of teaching practical theory between groups led by different practical leaders
  • To provide students with engaging resources to use outside of the classroom, both as preparation tools for practical classes and as revision aids for the Blackboard-based practical assessment

Context

The Part 1 Bacteriology & Virology module includes 12 hours of practical classes designed to teach students key microbiological techniques and theory. I usually begin each practical with a short lecture-style introduction to explain what they need to do and why. The 3-hour classes are typically very busy, and I have observed that some students feel overwhelmed with “information overload” and find it hard to assimilate the theory whilst learning the new techniques. I have had to schedule multiple runs of practical classes to accommodate the large cohort, and my colleagues now teach some of the repeat sessions. My aim was to create a series of videos explaining the theoretical background in more detail, which students can access outside of the classroom. I hoped this would ensure consistency in what is taught to each group and give the students more time to focus on learning the techniques during the classes. I hoped that they would use the resources both to help prepare for the classes and as a revision aid for the practical assessment.

Implementation

I initially tried to record 4 videos by simply recording myself talking through my original PowerPoint presentations that I use in the practical class introductions (i.e. 4 individual videos to cover each of the 4 practical classes). Having started to make the videos, I realised that it was very difficult for me to explain the theory in this format, which was quite surprising given this is how I had been delivering the information up until that point! I therefore adapted the PowerPoint presentations to make videos focusing on each of the experimental themes, talking through what the students will do in the lab week-by-week with an explanation of the theory at appropriate points. I recorded the video tutorials using the Mediasite “slideshow + audio” option and narrated free-style as I would do in a lecture (no script). When I made a mistake, I paused for a few seconds and then started the sentence again. After finishing the entire recording, I then used the editing feature to cut out the mistakes, which were easy to identify in the audio trace due to the long pauses. I was also able to move slides to the appropriate place if I had poorly timed the slide transitions. Editing each video took around 30 min to 1 hr. I found it relatively easy to record and edit the videos and I became much more efficient after I had recorded the first few videos.

I would have liked to have asked students and other staff to help in the design and production of the videos, but the timing of the Pilot was not conducive to being able to collaborate at the time.

Impact

Mediasite analytics show that 45% of the students in the cohort viewed at least some of the resources, and 17% of the cohort viewed each video more than once. Students watched the three shorter videos (3-4 min) in their entirety, but the longest video (18 min) showed a drop-off in the number of views after approximately 5 min (Figure 1), so in future I will limit my videos to 5 min maximum.

Figure 1. Graph showing how students watched the videos

Only a few students viewed videos prior to practical classes; almost all views were in the few days leading up to the practical assessment on Blackboard. This shows that students were using the videos as a revision aid rather than as a preparation tool. This is probably because I uploaded the videos midway through term: by that stage one of the three groups had already completed the 4 practical classes, and I did not want to disadvantage this group by promoting the videos as a preparation tool. It will be interesting to see whether I can encourage students to use them for this purpose next academic year. My expectation was that time spent viewing would correlate directly with practical assessment grades; however, there is not a clear linear correlation (Figure 2).

Figure 2. Graph showing use of videos and grades obtained

For some students attending the practical classes and reading the handbook is enough to achieve a good grade. However, students that spent time viewing the videos did get a higher average than those that did not view any (Figure 3), although this probably reflects overall engagement with all the available learning resources.  Responses to the student survey indicated that students felt the videos improved their understanding of the topic and supported them to revise what they had learnt in class at their own pace.

Figure 3. Graph showing video watching and grades obtained

Reflections

The biggest challenge I faced was trying to recruit other colleagues to the pilot during a very busy Autumn term and finding the time to design the videos myself. It would have been helpful to see some examples of how to use personal capture before I started, but having participated in the Pilot, I now have more confidence. Once I had experimented with the Mediasite software, I found it quite easy to record the videos and publish them to my Blackboard site (with guidance from the excellent TEL team and the Blackboard help web pages). I liked the editing tools, although I would very much like the ability to cut and paste different videos together. The analytics are very useful and much better than the “track users” function in Blackboard. They reinforced the suggestion that students are much more likely to finish watching short videos, and I would advise making videos 5 min maximum, ideally 3 min, in length. My experience of personal capture was incredibly positive, and I will certainly be making more resources for my students across all my modules.

Follow-up

Since making the recordings for the Pilot, I have teamed up with several colleagues in the School of Biological Sciences and will show them how to use Mediasite so that they can make resources for their modules over summer. I have also used the Mediasite software to record microscope training sessions and talks from open days.

Building bridges and smoothing edges

Patrick Finnegan – School of Economics, Politics & International Relations

Overview

My use of the personal capture scheme was intended to enhance our teaching methods within the department. My initial aim of building additional video-capture material into the ongoing lecture series did not come to fruition, but I was able to use the capture package to engage my students more in the administration of a (then) overly complicated module.

Objectives

  • Initial plan centred on including personal capture on the Army Higher Education Pathway project – this was not possible due to software incompatibility with the Canvas platform used for the project
  • New objectives were based on a different module (The Study of Politics) and improving the student experience on that module
  • Improve the explanation of methods
  • Explain the supervisory choice system
  • Enhance lectures on complicated topics

Context

The module I focused on was Po2SOP (The Study of Politics), with 160 students. Personal capture allowed me, as convenor of our largest module, to communicate with all of my students in a more engaging way. We needed a way to bring the topic to life and ensure that the students took on board the lessons we needed them to. I wanted to include real examples of the methods in action and to use screencasts to explain certain decisions that would be too difficult to convey via email.

Implementation

Unfortunately, the project began too late in the term to really affect the lectures on this module, which is co-taught by several staff members, often using pre-existing slides. However, I was able to use it to engage in discussion with students, explaining issues such as supervisor reallocation during the year and how our special event – the mini-conference – was to work. Rather than writing lengthy emails, I was able to explain to the students quickly and visually what was happening and to invite their responses, which some gave. They did not engage with the capture material as such, but my use of it did encourage discussion about how they would like to see it used in future and how they would like to receive feedback on assessments if audio/visual options were available. The recordings made by myself and my colleague were mainly PowerPoint voice-overs or direct-to-camera discussions. This allowed us to present the students with illustrations and ‘first-hand’ information. The recordings required significant editing to make them suitable, but the final product was satisfactory.

Impact

Beyond ‘ease of life’ effects this year, there was not a great deal of impact, but this was expected given the start date (the largest number of views for a video was 86, and this was an admin-explanation-style video). However, planning for next year has already incorporated the different potential advantages of personal capture. For example, the same methods module will now incorporate tutorial videos made within the department and will maintain some supervisor ‘adverts’ to allow students to better choose which member of staff they wish to work with in future. Within other modules, some staff members will be taking the opportunity to build in flipped-classroom-style teaching and other time-heavy elements that were not previously available to them.

Reflections

The time needed to organise and direct co-pilots within a teaching-heavy department was far greater than I originally planned. I was also not expecting the level of resistance I met from some more established staff, who were not interested in changing how they delivered material they had prepared earlier. The major change I would make going forward would be to focus on upcoming modules rather than pre-existing ones, as incorporating the material once a module had already started proved too difficult.

Follow-up

I have started to prepare some videos on material I know will be needed in the future; this is relatively straightforward to do and will mimic general practice to date. The main evolution will be seen in responding to student needs during class, and in how screencasts can be made on demand and with consistent quality.

Creating screencast videos to support and engage post-graduate students

Sue Blackett – Henley Business School, 2018-19


Overview

I participated in the university’s Personal Capture pilot as a Champion for my school, trialling the Mediasite tool to create screencast videos for use in teaching and learning. My aim was to help PGT students get to grips with key elements of the module. The videos allowed students to view content repeatedly, with the aim of increasing engagement with the module. Some videos were watched multiple times at different points throughout the term, indicating that information needed to be refreshed.

Objectives

  1. To connect with the cohort and establish module expectations. 
  2. Reduce class time taken up with module administration. 
  3. Provide coursework feedback in an alternative form and reinforce its feedforward use for the final exam. 
  4. To provide exam revision advice and highlight areas of focus. 
  5. Support students with weaker English language skills. 
  6. Provide module materials in a reusable, accessible and alternative form. 

Context

The target audience was students on ACM003 Management Accounting Theory & Practice, a postgraduate course where 91% of students were native Mandarin speakers. English language skills were an issue for some students, so the captured videos provided opportunities for them to re-watch and get to grips with the content at their leisure. In addition, I wanted to free up class contact time so I could focus on content in areas that had been more challenging on the previous run of the module. Also, by using different colours and font sizes on the PowerPoint slides, the visual emphasis of key points reinforced the accompanying audio.

Implementation

The first video recorded was a welcome-to-the-module video (slides and audio only) that covered the module administration, i.e. an overview of the module, an outline of the assessment, key dates, the module textbook, etc. The content for the video was relatively straightforward, as it was taken from the first lecture’s slides. By isolating the module admin information, more detail could be added, e.g. mapping assessable learning outcomes to assessments and explaining the purpose of each type of assessment. When recording the first video, I did not follow a script, as I was trying to make my delivery sound more natural. Instead, I made short notes on slides that needed extra information and printed off the presentation as slides with notes. As this is the same strategy I use to deliver lectures, I was less concerned about being “audio ready”, i.e. not making errors in my voice recording. 

In the second and third videos (coursework feedback and exam revision advice), I included video of myself delivering the presentations. As the recordings were made in my home office, additional visual matters had to be considered: what I was wearing, the background behind me, looking into the camera, turning pages, etc. The second attempt at each recording was much more fluent and was therefore the version uploaded to Blackboard. 

The last two recordings were quite different in nature. The coursework feedback video used visuals of bar charts and tables to communicate statistics, accompanied by audio that focused on qualitative feedback. The exam revision video used lots of narrative bullet points. 

Examples of my videos:

Welcome to module: https://uor.mediasite.com/Mediasite/Play/7a7f676595c84507aa31aafe994f2f071d

Assessed coursework feedback: https://uor.mediasite.com/Mediasite/Play/077e974725f44cc8b0debd6361aaaba71d

Exam revision advice: https://uor.mediasite.com/Mediasite/Play/94e4156753c848dbafc3b5e75a9c3d441d

Resit Exam Advice: https://uor.mediasite.com/Mediasite/Play/e8b88b44a7724c5aa4ef8def412c22fd1d

Impact

The welcome video did have impact, as it was the only source of information about the administration of the course. When students arrived at the first class with the textbook, this indicated that they had been able to access the information they needed to prepare for the course. Student response to the Personal Capture pilot project questionnaire was low (18%); however, the general feedback was that the videos were useful in supporting them during the course. 

Analysis of the viewing data from Mediasite and Blackboard provided some very interesting insights: 

  1. Most students did not watch the videos as soon as they were released. 
  2. Some of the videos were watched multiple times throughout the term by weaker and stronger students. 
  3. Some students were not recorded as having accessed the videos. 
  4. Students were focused for the first 20–60 seconds of each video and then skipped through the videos. 
  5. Few students watched the videos from start to finish, e.g. the average time watched for the 4 mins 49 secs welcome video was 2 mins 10 secs. The coursework feedback video was 9 mins 21 secs long; however, average viewing time was 3 mins 11 secs. The revision video followed the same trend, being 8 mins 41 secs long with an average watching time of 2 mins 55 secs.
     

Review of video along with watching trends showed that students skipped through the videos to the points where slides changed. This suggested that the majority were reading the slides rather than listening to the accompanying commentary which contained supplementary information. 
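As a rough illustration of the figures in point 5 above, the average watched fraction of each video can be computed from the reported durations. This is a minimal sketch; the `watched_fraction` helper is my own invention and not part of Mediasite or Blackboard:

```python
def watched_fraction(length, average_watched):
    """Return the average proportion of a video that was watched.

    Both arguments are (minutes, seconds) tuples taken from the
    analytics figures quoted above.
    """
    to_secs = lambda t: t[0] * 60 + t[1]
    return to_secs(average_watched) / to_secs(length)

# Durations reported in point 5 of the analytics.
videos = {
    "welcome": ((4, 49), (2, 10)),
    "coursework feedback": ((9, 21), (3, 11)),
    "exam revision": ((8, 41), (2, 55)),
}

for name, (length, avg) in videos.items():
    print(f"{name}: {watched_fraction(length, avg):.0%} watched on average")
```

In each case less than half of the video was watched on average (roughly 45%, 34% and 34% respectively), consistent with the skipping behaviour described above.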

As no student failed to meet the admin expectations of the course, it seems likely that those who had not watched the video were informed by those who had. 

Reflections

The analytics were most illuminating. My appearing in the videos was intended to establish bonds with the cohort and increase engagement; however, my appearance seemed to be irrelevant, as the students were focused on reading rather than listening. This could have been due to weaker listening skills, but it also highlights that students might think all important information is written down rather than spoken.  

Videos with graphics were watched more than those without, so my challenge will be to think about what content I include in slides, i.e. more graphics with fewer words and/or narrative slides with no audio. 

I will continue with capture videos; however, I will do more to test their effectiveness. For example, I will design in-class quizzes using Kahoot, Mentimeter, etc. to test whether the content of the videos has been internalised. 

Follow-up

I’ve become much quicker at designing the PowerPoint content and less worried about stumbling or searching for the right words to use. I have been able to edit videos more quickly, e.g. cutting out excessive time and cropping the end of the video. Embedding videos in Blackboard has also become easier the more I’ve done it. The support information was good; however, I faced a multitude of problems that IT Support had to help me with, which, if I’m honest, was putting me off using the tool (I’m a Mac user mostly using this tool off campus).  

 

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic. Click here for the first entry, which outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students connect with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These forms of conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether teaching and learning aims were perceived by students to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but we decided this time to give students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A Survey Monkey was circulated to all DEL students with the same series of questions as those used for the focus group in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The Survey Monkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% in the focus group preferred to keep to the traditional exam + assessed essay method, the survey found the aversion to exams to be more prominent. 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We found it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second highest scorer (presumably a connection between the two was identified by students).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy may not be entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, this would leave modules retaining an exam in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but this could leave modules that retain traditional assessment patterns vulnerable to students deselecting them. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL in a presentation. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the first link which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: assessment in DEL, particularly at Part 2, follows a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments best suited to their attainment.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.

Rethinking assessment design, to improve the student/staff experience when dealing with video submissions

Rachel Warner, School of Arts and Communication Design

Rachel.Warner@pgr.reading.ac.uk

Jacqueline Fairbairn, Centre for Quality Support and Development

j.fairbairn@reading.ac.uk

Overview

Rachel in Typography and Graphic Communication (T&GC) worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow, to improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.

This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.

Objectives

  • Improve staff and student experiences when dealing with video assignment submissions. Specifically, streamlining workflows by improving student assessment literacy and making use of university OneDrive accounts.
  • Support students to develop professional skills for the future, through assessment design (developing digital literacies and communication skills).
  • Provide an authentic assessment experience, in which students self-select technologies (choosing software and a task to demonstrate) to answer a brief.

Context

The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.

Rachel wanted to review the process for submitting video work for e-assessment, and find ways to streamline the time-intensive marking process, particularly accessing and reviewing video files, without compromising good assessment practice. This is also acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark, so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”

The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving the assessment literacy of students was also a key driver, to reduce the number of queries and technical issues when working with video technologies.

Implementation

Rachel consulted TEL, to discuss:

  • balancing the pedagogic implications of altering the assignment
  • technical implications, such as submission to Blackboard and storage of video

To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Use of OneDrive encouraged professional behaviours, such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised industry-standard platform.

To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.

Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.

The most effective intervention was the introduction of a formative support session (1.5 hrs). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop helped students to understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, the toolkit and the digital coversheet, and allowed students to explore the technologies in a supported manner (improving students’ assessment literacy!).

The assignment instructions were strategically revised, to include information (‘hints and tips’) to support the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).

Students were given the option to self-select recording/editing software to undertake the screencast video. Free-to-use tools that students could explore were recommended: ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.

Impact

Marking the submissions was made easier by the ability to access videos in a consistent format, using a clearly structured submission process (digital coversheet). The ability to play URL links directly through OneDrive meant Rachel was able to store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting key timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).

Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly speeded up the marking workflow and allowed Rachel to identify patterns in students’ work, where common feedback statements could be applied, as appropriate.
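The cross-referencing idea behind such a spreadsheet can be sketched in a few lines of Python. This is a hypothetical illustration, not Rachel’s actual rubric or data: the criteria and feedback statements below are invented. It tallies how often each feedback statement recurs under each criterion, surfacing the common statements that could be reused:

```python
from collections import Counter

# Invented feedback records: one dict per student submission,
# mapping rubric criterion -> feedback statement given.
feedback = [
    {"clarity": "good pacing", "technique": "explain shortcuts used"},
    {"clarity": "good pacing", "technique": "strong tool selection"},
    {"clarity": "reduce on-screen jargon", "technique": "explain shortcuts used"},
]

# Tally statements per criterion to surface patterns across the cohort.
patterns = {
    criterion: Counter(row[criterion] for row in feedback)
    for criterion in feedback[0]
}

for criterion, counts in patterns.items():
    statement, n = counts.most_common(1)[0]
    print(f"{criterion}: most common feedback = {statement!r} ({n} students)")
```

A spreadsheet achieves the same end with a pivot table; the point is that recording feedback against named criteria, rather than as free text, is what makes patterns visible.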

The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before, and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.

Reflections

The modifications to the assessment:

  • Reflected professional digital competencies required of the discipline;
  • Allowed students to explore a new technology and way of working in a supported context; and,
  • Built confidence, facilitated assessment literacy, and encouraged reflection.

Future modifications to the screencast submission:

  • Peer review could be implemented, asking students to upload videos to a shared space for formative feedback (such as Facebook or a Blackboard discussion board).
  • The digital coversheet had to be downloaded to access URL links. In future, students could paste into the submission comment field, for easier access when marking.
  • Rachel is developing a self-assessment checklist to help students reflect on the production values of their work. The summative assessment rubric is focused on video content, not production values, however, it would be useful for students to get feedback on professional work skills. For example, communication skills and use of narrative devices which translate across other graphic mediums.

Toolkit basics:

a thumbnail image of a toolkit document, full access available via links in webpage

  • Outline task expectations and software options, give recommendations
  • Source examples of screencasts from your industry, discuss with students.
  • Provide hints and tips for creating effective screencasts.
  • Provide submission text. Consider asking students to use the ‘submission comment’ field to paste links to their work, for quick marker access to URLs.
  • Plan a formative workshop session, to practise using the software and go through the submission process (time invested here is key!).
  • Create a self-assessment checklist, to enhance the production quality of videos and highlight the transferable skills that can be developed by focusing on production quality.
  • Consider creating a shared online space for formative peer-feedback (e.g. Blackboard discussion forum).
  • Consider using a marking spreadsheet to cross-reference feedback and highlight good examples of screencasts that can be utilised in other teaching.

Links

Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted: ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and learn from ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.

Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a technique in Photoshop that requires multiple elements to achieve results. It concludes by demonstrating the student’s awareness that the technique is useful in scenarios other than the one demonstrated, encouraging the viewer to continue learning. The student used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.

Screencast example: (YouTube link) This demonstrates a student who is competent in a tool, able to use their own work (work from another module on the course) to demonstrate a task, and additionally includes their research into how the tool can be used for other tasks.

Other screencast activity from the Typography & Graphic Communication department from the GRASS project:  (Blog post) Previous project for Part 1s that included use of screencasts to demonstrate students’ achievements of learning outcomes.