Using personal capture to support students to learn practical theory outside of the laboratory

Dr Geraldine (Jay) Mulley – School of Biological Sciences  

Overview

I produced four screencasts to encourage students to prepare better for practical classes and to reinforce practical theory taught in class. Approximately 45% of the cohort watched at least some of the video content, mainly in the few days leading up to the practical assessment. The students appreciated the extra resources, and there was a noticeable improvement in module satisfaction scores.

Objectives

  • To provide consistency in the teaching of practical theory between groups led by different practical leaders
  • To provide students with engaging resources to use outside of the classroom, both as preparation tools for practical classes and as revision aids for the Blackboard-based practical assessment

Context

The Part 1 Bacteriology & Virology module includes 12 hours of practical classes designed to teach students key microbiological techniques and theory. I usually begin each practical with a short lecture-style introduction to explain what the students need to do and why. The three-hour classes are typically very busy, and I have observed that some students feel overwhelmed by “information overload” and find it hard to assimilate the theory while learning the new techniques. I have had to schedule multiple runs of the practical classes to accommodate the large cohort, and my colleagues now teach some of the repeat sessions. My aim was to create a series of videos, accessible outside of the classroom, explaining the theoretical background in more detail. I hoped this would ensure consistency in what is taught to each group and give the students more time to focus on learning the techniques during the classes. I hoped that they would use the resources both to help prepare for the classes and as a revision aid for the practical assessment.

Implementation

I initially tried to record four videos by simply recording myself talking through the original PowerPoint presentations I use in the practical class introductions (i.e. one video for each of the four practical classes). Having started to make the videos, I realised that it was very difficult for me to explain the theory in this format, which was quite surprising given that this is how I had been delivering the information up until that point! I therefore adapted the PowerPoint presentations to make videos focusing on each of the experimental themes, talking through what the students will do in the lab week by week, with an explanation of the theory at appropriate points. I recorded the video tutorials using the Mediasite “slideshow + audio” option and narrated freestyle, as I would in a lecture (no script). When I made a mistake, I paused for a few seconds and then started the sentence again. After finishing the entire recording, I used the editing feature to cut out the mistakes, which were easy to identify in the audio trace because of the long pauses. I was also able to move slides to the appropriate place if I had poorly timed the slide transitions. Editing each video took around 30 minutes to an hour. I found it relatively easy to record and edit the videos, and I became much more efficient after the first few recordings.

I would have liked to ask students and other staff to help with the design and production of the videos, but the timing of the Pilot was not conducive to collaboration.

Impact

Mediasite analytics show that 45% of the students in the cohort viewed at least some of the resources, and 17% of the cohort viewed each video more than once. Students watched the three shorter videos (3–4 min) in their entirety, but the longest video (18 min) showed a drop-off in the number of views after approximately 5 minutes (Figure 1), so in future I will limit my videos to a maximum of 5 minutes.

Figure 1: Graph showing how students watched the video

Only a few students viewed the videos prior to practical classes; almost all views came in the few days leading up to the practical assessment on Blackboard. This shows that students were using the videos as a revision aid rather than as a preparation tool. This is probably because I uploaded the videos midway through term, by which stage one of the three groups had already completed the four practical classes, and I did not want to disadvantage that group by promoting the videos as a preparation tool. It will be interesting to see whether I can encourage students to use them for preparation next academic year. My expectation was that time spent viewing would correlate directly with practical assessment grades; however, there is no clear linear correlation (Figure 2).

Figure 2: Graph showing use of videos and grades obtained
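As a purely illustrative aside, this kind of "no clear linear correlation" observation is easy to sanity-check against exported analytics. In the sketch below the viewing times and grades are invented placeholder values, not the module's real data, and the Pearson calculation is a standard formula rather than anything Mediasite itself provides.

```python
# Illustrative only: viewing_minutes and grades are invented placeholder
# values, not the module's real analytics data.
from statistics import mean, stdev

viewing_minutes = [0, 2, 5, 8, 12, 15, 20, 25]   # hypothetical per-student totals
grades = [55, 48, 62, 70, 58, 74, 65, 72]        # hypothetical assessment marks

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson_r(viewing_minutes, grades)
print(f"r = {r:.2f}")  # values near 0 suggest no clear linear relationship
```

A small |r| (well below, say, 0.3) would be consistent with a scatter in which some low-viewing students still achieve good grades.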

For some students, attending the practical classes and reading the handbook is enough to achieve a good grade. However, students who spent time viewing the videos did achieve a higher average than those who did not view any (Figure 3), although this probably reflects overall engagement with all the available learning resources. Responses to the student survey indicated that students felt the videos improved their understanding of the topic and supported them in revising what they had learnt in class at their own pace.

Figure 3: Graph showing video watching and grades obtained

Reflections

The biggest challenge I faced was trying to recruit colleagues to the Pilot during a very busy Autumn term, and finding the time to design the videos myself. It would have been helpful to see some examples of how to use personal capture before I started, but having participated in the Pilot I now have more confidence. Once I had experimented with the Mediasite software, I found it quite easy to record the videos and publish them to my Blackboard site (with guidance from the excellent TEL team and the Blackboard help web pages). I liked the editing tools, although I would very much like the ability to splice different videos together. The analytics are very useful and much better than the “track users” function in Blackboard. They reinforced the suggestion that students are much more likely to finish watching short videos, and I would advise making videos 5 minutes long at most, ideally 3 minutes. My experience of personal capture was incredibly positive, and I will certainly be making more resources for students on all my modules.

Follow-up

Since making the recordings for the Pilot, I have teamed up with several colleagues in the School of Biological Sciences and will show them how to use Mediasite so that they can make resources for their modules over the summer. I have also used the Mediasite software to record microscope training sessions and open day talks.

Building bridges and smoothing edges

Patrick Finnegan – School of Economics, Politics & International Relations

Overview

My use of the personal capture scheme was intended to enhance our teaching methods within the department. My initial aim of building additional video capture material into the ongoing lecture series did not come to fruition, but I was able to use the capture package to engage my students more in the administration of a (then) overly complicated module.

Objectives

  • Initial plan centred on including personal capture on the Army Higher Education Pathway project – this was not possible due to software incompatibility with the Canvas platform used for the project
  • New objectives were based on a different module (The Study of Politics) and improving the student experience on that module
  • Improve the explanation of methods
  • Explain the supervisory choice system
  • Enhance lectures on complicated topics

Context

The module I focused on was Po2SOP (The Study of Politics), with 160 students. Personal capture was valuable on this project because it allowed me, as convenor of our largest module, to communicate with all of my students in a more engaging way. We needed a way to bring the topic to life and ensure that the students took on board the lessons we needed them to. I wanted to include real examples of the methods in action and to use screencasts to explain certain decisions that would be too difficult to cover via email.

Implementation

Unfortunately, the project began too late in the term to really affect the lectures on this module, which is co-taught between several staff members, often using pre-existing slides. However, I was able to use it to engage in discussion with students, explaining issues such as supervisor reallocation during the year and how our special event, the mini-conference, was to work. Rather than writing lengthy emails, I was able to explain quickly and visually to the students what was happening and to invite their responses, which some gave. They did not engage with the capture material as such, but my use of it did encourage discussion about how they would like to see it used in future, and how they would like to receive feedback on assessments if audio/visual options were available. The recordings made by myself and my colleague were mainly PowerPoint voice-overs or direct-to-camera discussions. This allowed us to present the students with illustrations and ‘first-hand’ information. The recordings required significant editing to make them suitable, but the final product was satisfactory.

Impact

Beyond ‘ease of life’ effects this year, there was not a great deal of impact, but this was expected given the start date (the largest number of views for any video was 86, and that was an administrative explanation video). However, planning for next year has already incorporated the potential advantages offered by personal capture. For example, the same methods module will now incorporate tutorial videos made within the department and will maintain some supervisor ‘adverts’ to allow students to make a better-informed choice of the member of staff they will seek to work with. Within other modules, some staff members will take the opportunity to build in flipped-classroom-style teaching and other time-heavy elements that were not previously available to them.

Reflections

The time needed to organise and direct co-pilots within a teaching-heavy department was far greater than I originally planned for. I was also not expecting the level of resistance I met from some more established staff, who were not interested in changing how they delivered material they had prepared earlier. The major change I would make going forward would be to focus on upcoming modules rather than pre-existing ones, as incorporating the material once a module had already started proved too difficult.

Follow-up

I have started to prepare some videos on material I know will be needed in the future; this is relatively straightforward to do and will mimic the general practice to date. The main evolution will be seen in responding to student need during class, and in how screencasts can be made on demand and with consistent quality.

Creating screencast videos to support and engage postgraduate students

Sue Blackett – Henley Business School, 2018-19


Overview

I participated in the university’s Personal Capture pilot as a Champion for my school, trialling the Mediasite tool to create screencast videos for use in teaching and learning. My aim was to help PGT students get to grips with key elements of the module. The videos enabled students to view content repeatedly, with the aim of increasing engagement with the module. Some videos were watched multiple times at different points throughout the term, indicating that information needed to be refreshed.

Objectives

  1. Connect with the cohort and establish module expectations. 
  2. Reduce class time taken up with module administration. 
  3. Provide coursework feedback in an alternative form and reinforce its feedforward use for the final exam. 
  4. Provide exam revision advice and highlight areas of focus. 
  5. Support students with weaker English language skills. 
  6. Provide module materials in a reusable, accessible and alternative form. 

Context

The target audience was students on ACM003 Management Accounting Theory & Practice, a postgraduate course on which 91% of students were native Mandarin speakers. English language skills were an issue for some students, so the screencast videos gave students the opportunity to re-watch and get to grips with the content at their leisure. In addition, I wanted to free up class contact time so that I could focus on content areas that had proved more challenging on the previous run of the module. Also, by using different colours and font sizes on the PowerPoint slides, the visual emphasis of key points reinforced the accompanying audio.

Implementation

The first video recorded was a welcome-to-the-module video (slides and audio only) covering the module administration: an overview of the module, an outline of the assessment, key dates, the module textbook, etc. The content was relatively straightforward to produce as it was taken from the first lecture’s slides. By isolating the module admin information, more detail could be added, e.g. mapping assessable learning outcomes to assessments and explaining the purpose of each type of assessment. When recording the video I did not follow a script, as I was trying to make my delivery sound more natural. Instead, I made short notes on slides that needed extra information and printed off the presentation as slides with notes. As this is the same strategy I use to deliver lectures, I was less concerned about being “audio ready”, i.e. not making errors in the voice recording.

In the second and third videos (coursework feedback and exam revision advice), I included video of myself delivering the presentations. As the recordings were made in my home office, additional visual matters had to be considered: what I was wearing, the background behind me, looking into the camera, turning pages, etc. The second attempt at each recording was much more fluent, and these were the versions uploaded to Blackboard.

The last two recordings were quite different in nature. The coursework feedback video used visuals of bar charts and tables to communicate statistics, accompanied by audio that focused on qualitative feedback. The exam revision video used lots of narrative bullet points.

Examples of my videos:

Welcome to module: https://uor.mediasite.com/Mediasite/Play/7a7f676595c84507aa31aafe994f2f071d

Assessed coursework feedback: https://uor.mediasite.com/Mediasite/Play/077e974725f44cc8b0debd6361aaaba71d

Exam revision advice: https://uor.mediasite.com/Mediasite/Play/94e4156753c848dbafc3b5e75a9c3d441d

Resit Exam Advice: https://uor.mediasite.com/Mediasite/Play/e8b88b44a7724c5aa4ef8def412c22fd1d

Impact

The welcome video did have impact, as it was the only source of information about the administration of the course. Students arriving at the first class with the textbook indicated that they had been able to access the information they needed to prepare. The response rate to the personal capture pilot project questionnaire was low (18%); however, the general feedback was that the videos were useful in supporting students during the course.

Analysis of the analytics in Mediasite and Blackboard provided some very interesting insights:

  1. Most students did not watch the videos as soon as they were released. 
  2. Some of the videos were watched multiple times throughout the term, by both weaker and stronger students. 
  3. Some students were not recorded as having accessed the videos. 
  4. Students were focused for the first 20–60 seconds of each video and then skipped through. 
  5. Few students watched the videos from start to finish. For example, the average viewing time for the 4 min 49 sec welcome video was 2 min 10 sec. The coursework feedback video was 9 min 21 sec long, with an average viewing time of 3 min 11 sec, and the revision video followed the same trend, being 8 min 41 sec long with an average viewing time of 2 min 55 sec.

Review of the videos alongside the watching trends showed that students skipped through to the points where slides changed. This suggests that the majority were reading the slides rather than listening to the accompanying commentary, which contained supplementary information.
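The average viewing times quoted above can be turned into completion percentages with simple arithmetic; a minimal sketch, using only the durations reported in this case study:

```python
# Completion percentages for the three videos, using the lengths and
# average viewing times reported in this case study.
videos = {
    "welcome": (4 * 60 + 49, 2 * 60 + 10),             # (length s, avg viewed s)
    "coursework feedback": (9 * 60 + 21, 3 * 60 + 11),
    "exam revision": (8 * 60 + 41, 2 * 60 + 55),
}

for name, (length_s, viewed_s) in videos.items():
    print(f"{name}: {100 * viewed_s / length_s:.0f}% viewed on average")
# → welcome: 45%, coursework feedback: 34%, exam revision: 34%
```

On average, only around a third to a half of each video was viewed, which reinforces the recommendation elsewhere in these case studies to keep recordings short.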

As no student failed to meet the admin expectations of the course, those who had not watched the welcome video were presumably informed by those who had.

Reflections

The analytics were most illuminating. Appearing in the videos myself was supposed to establish bonds with the cohort and increase engagement; however, my appearance seemed to be irrelevant, as the students were focused on reading rather than listening. This could have been due to weaker listening skills, but it also highlights that students may assume all important information is written down rather than spoken.

Videos with graphics were watched more than those without, so my challenge will be to think about what content I include in slides, i.e. more graphics with fewer words and/or narrative slides with no audio.

I will continue with capture videos; however, I will do more to test their effectiveness. For example, I will design in-class quizzes using Kahoot, Mentimeter, etc. to test whether the content of the videos has been internalised.

Follow-up

I’ve become much quicker at designing the PowerPoint content and less worried about stumbling or searching for the right words. I can now edit videos more quickly, e.g. cutting out excessive time and cropping the end of the video. Embedding videos in Blackboard has also become easier the more I’ve done it. The support information was good; however, I faced a multitude of problems that IT Support had to help me with, which, if I’m honest, was putting me off using the tool (I’m a Mac user, mostly using the tool off campus).

 

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic; the first entry outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students associate with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but we decided this time to give students the opportunity to select four elements of feedback, which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A SurveyMonkey questionnaire was circulated to all DEL students with the same series of questions as those used for the focus group, in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The SurveyMonkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% of the focus group preferred to keep the traditional exam + assessed essay method, the survey found the aversion to exams to be even more pronounced: 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • It is helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second highest scorer (presumably students identified a connection between the two).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy may not be entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, modules that retain an exam would be left in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but doing so could leave modules that retain traditional assessment patterns vulnerable to students deselecting them, which may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the first link which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: the assessment conventions in DEL, particularly at Part 2, follow a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments that are best suited to the attainment of them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.

Rethinking assessment design, to improve the student/staff experience when dealing with video submissions

Rachel Warner, School of Arts and Communication Design

Rachel.Warner@pgr.reading.ac.uk

Jacqueline Fairbairn, Centre for Quality Support and Development

j.fairbairn@reading.ac.uk

Overview

Rachel, in Typography and Graphic Communication (T&GC), worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow and improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.

This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.

Objectives

  • Improve staff and student experiences when dealing with video assignment submissions. Specifically, streamlining workflows by improving student assessment literacy and making use of university OneDrive accounts.
  • Support students to develop professional skills for the future, through assessment design (developing digital literacies and communication skills).
  • Provide an authentic assessment experience, in which students self-select technologies (choosing software and a task to demonstrate) to answer a brief.

Context

The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.

Rachel wanted to review the process for submitting video work for e-assessment, and to find ways to streamline the time-intensive marking process, particularly in accessing and reviewing video files, without compromising good assessment practice. This is acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark, so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”

The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving students’ assessment literacy was also a key driver, to reduce the number of queries and technical issues when working with video technologies.

Implementation

Rachel consulted TEL, to discuss:

  • balancing the pedagogic implications of altering the assignment
  • technical implications, such as submission to Blackboard and storage of video

To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Use of OneDrive encouraged professional behaviours, such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised, industry-standard platform.

To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.

Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.

The most effective intervention was the introduction of a formative support session (1.5 hr). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop helped students understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, toolkit and digital coversheet, and allowed students to explore the technologies in a supported manner (improving students’ assessment literacy!).

The assignment instructions were strategically revised, to include information (‘hints and tips’) to support the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).

Students were given the option to self-select recording/editing software for the screencast video. Free-to-use tools that students could explore were recommended: ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.

Impact

Marking the submissions was made easier by the ability to access videos in a consistent format, using a clearly structured submission process (the digital coversheet). The ability to play URL links directly through OneDrive meant Rachel was able to store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting key timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).

Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to effectively cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly sped up the marking workflow and allowed Rachel to identify patterns in students’ work where common feedback statements could be applied, as appropriate.

The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before, and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.

Reflections

The modifications to the assessment:

  • Reflected professional digital competencies required of the discipline;
  • Allowed students to explore a new technology and way of working in a supported context; and,
  • Built confidence, facilitated assessment literacy, and encouraged reflection.

Future modifications to the screencast submission:

  • Peer review could be implemented, asking students to upload videos to a shared space for formative feedback (such as Facebook or a Blackboard discussion board).
  • The digital coversheet had to be downloaded to access URL links. In future, students could paste URLs into the submission comment field for easier access when marking.
  • Rachel is developing a self-assessment checklist to help students reflect on the production values of their work. The summative assessment rubric focuses on video content rather than production values; however, it would be useful for students to get feedback on professional work skills, such as communication skills and the use of narrative devices that translate across other graphic mediums.

Toolkit basics:

[Thumbnail image of the toolkit document; full access is available via links on the webpage.]

  • Outline task expectations and software options, give recommendations
  • Source examples of screencasts from your industry, discuss with students.
  • Provide hints and tips for creating effective screencasts.
  • Provide submission text. Consider asking students to use the ‘submission comment’ field to paste links to their work, for quick marker access to URLs.
  • Plan a formative workshop session, to practise using the software and go through the submission process (time invested here is key!).
  • Create a self-assessment checklist to enhance the production quality of videos and highlight transferable skills that can be developed by focusing on the quality of the production.
  • Consider creating a shared online space for formative peer-feedback (e.g. Blackboard discussion forum).
  • Consider using a marking spreadsheet to cross-reference feedback and highlight good examples of screencasts that can be utilised in other teaching.

Links

Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and to learn from ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.

Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a technique in Photoshop that requires multiple elements to achieve results. It has a conclusion that demonstrates the student’s awareness that the technique is useful in scenarios other than the one demonstrated, giving the listener encouragement to continue learning. The student has used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.

Screencast example: (YouTube link) This demonstrates a student who is competent in a tool, able to use their own work (work from another module on the course) to demonstrate a task, and additionally includes their research into how the tool can be used for other tasks.

Other screencast activity from the Typography & Graphic Communication department from the GRASS project:  (Blog post) Previous project for Part 1s that included use of screencasts to demonstrate students’ achievements of learning outcomes.

Celebrating Student Success Through Staff-Student Publication Projects

Dr Madeleine Davies, Department of English Literature

m.k.davies@reading.ac.uk

Overview

In 2017 I replaced the exam on a Part 3 module I convene (‘Margaret Atwood’) with an online learning journal assessment and I was so impressed with the students’ work that I sought funding to publish selected extracts in a UoR book, Second Sight: The Margaret Atwood Learning Journals. The project has involved collaboration between the Department of English Literature and the Department of Typography & Graphic Communication, and it has confirmed the value of staff-student partnerships, particularly in relation to celebrating student attainment and enhancing graduate employability.

Objectives

  • To showcase the achievements of our Part 3 students before they graduate and to memorialise their hard work, engagement and ingenuity in material form
  • To demonstrate at Open Days and Visit Days the quality of teaching and learning in the Department of English Literature in order to support student recruitment
  • To create a resource for students enrolling on the module in future years
  • To encourage reflection and conversation in my School regarding the value of diversified assessment practice

Context

The ‘Margaret Atwood’ module has always been assessed through an exam and a summative essay but I was dissatisfied with the work the exam produced (I knew that my students could perform better) so I researched alternative assessment formats. In 2017 I replaced the exam with a Blackboard learning journal because my research suggested that it offered the potential to release students’ creative criticality. I preserved the other half of the assessment model, the formal summative essay, because the module also needed an assessment where polished critical reading would be rewarded. With both assessment elements in place, students would need to demonstrate flexible writing skills and adapt to different writing environments (essential graduate skills). A manifest benefit of journal assessment is that it offers students to whom essay-writing does not come easily an opportunity to demonstrate their true ability and engagement so the decision to diversify assessment connected with inclusive practice.

Implementation

I decided to publish the students’ writing in a UoR book because I did not want to lose their hard work to a digital black-hole: it deserved a wider audience. I sought funding from our Teaching and Learning Deans, who supported the project from the beginning, and I connected with the ‘Real Jobs’ scheme in the Department of Typography & Graphic Communication where students gain valuable professional experience by managing funded publishing commissions for university staff and external clients. This put me in contact with a highly skilled student typographer with an exceptional eye for design. I asked a member of the ‘Margaret Atwood’ group to help me edit the book because I knew that she wanted to pursue a career in publishing and this project would provide invaluable material for her CV. Together we produced a ‘permissions’ form for students to formally indicate that they were releasing their work to the publication and 27 out of 36 students who were enrolled on the Spring Term module responded; all warmly welcomed the initiative. Contributors were asked to submit Word files containing their entries so as to preserve the confidentiality of their online submissions; this was important because the editors and designers were fellow students. Throughout the Summer Term 2018, the students and I met and planned, designed and edited, and the result is a book of which we are proud. With the sole exception of the Introduction which I wrote, every element of it, from the cover image to the design to the contents, is the work of our students.

Impact

The impact of the project will be registered in terms of Open Days because Second Sight will help demonstrate the range of staff-student academic and employability activities in DEL. In addition, the project has consolidated connections between DEL and the Department of Typography & Graphic Communication, and we will build on this relationship in the next session.

A further impact, which cannot be evidenced easily, is that it provides a useful resource for our graduates’ job applications and interviews: students entering publishing or journalism, for example, will be able to speak to their participation in the project and to their work in the book. The collection showcases some excellent writing and artwork and DEL graduates can attend interviews with tangible evidence of their achievements and abilities.

Reflections

Producing this book with such talented editors, designers and contributors has been a joy: like the ‘Margaret Atwood’ module itself, Second Sight confirms the pleasures and the rewards of working in partnership with our students.

The project sharpened my own editing skills and created a space to share knowledge about publishing conventions with the students who were assisting me. We all learned a great deal from each other: June Lin, the Typography student designer, gave me and the student editor (Bethany Barnett-Sanders) insights into the techniques of type-setting and page layout. To reciprocate, Bethany and I enhanced June’s knowledge of Margaret Atwood’s work which she had read but never studied. This pooling of knowledge worked to the benefit of us all.

One of the advantages of the learning journal was that it allowed me a clear view of the inventiveness and ingenuity that students bring to their work, and my sense of appreciation for their skill was further enhanced by working with students on the book. Technically, this was less of a ‘staff-student’ collaboration than it was a mutual education between several people.

The process we followed for acquiring written permission from students to include their work in the book, and for gathering Word files to avoid confidentiality issues, was smooth, quick, and could not have been improved. The only difficulty was finding time to edit seventy-five contributions to the book in an already busy term. Whilst this was not easy, the results of the collaboration have made it well and truly worth it.

Follow up

It is too early to tell whether other DEL colleagues will choose to diversify their own assessments and, if they do, pursue a publishing project similar to the ‘Margaret Atwood’ example. There is, however, a growing need for Open Day materials, and Second Sight joins the Department’s Creative Writing Anthology in demonstrating that academic modules contain within them the potential for publication and collaborative initiative. I will certainly be looking to produce more publications of this nature on my other learning journal modules in the next session; in the meantime, copies of Second Sight will be taken with me to the outreach events I’m attending in July in order to demonstrate our commitment to student engagement, experience and employability here at the University of Reading.

Related entries

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment-part-2/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-in-student-participation-at-academic-conferences/

 

Engaging students in assessment design

Dr Maria Kambouri-Danos, Institute of Education

m.kambouridanos@reading.ac.uk

Year of activity 2016/17

Overview

This entry aims to share the experience of re-designing and evaluating assessment in collaboration with students. It explains the need for developing the new assessment design and then discusses the process of implementing and evaluating its appropriateness. It finally reflects on the impact of MCQ tests, when assessing students in higher education (HE), and the importance of engaging students as partners in the development of new assessment tools.

Objectives

  • To re-design assessment and remove a high-stakes assessment element.
  • To proactively engage ‘students as partners’ in the development and evaluation of the new assessment tool.
  • To identify the appropriateness of the new design and its impact on both students and staff.

Context

Child Development (ED3FCD) is the core module for the BA in Children’s Development and Learning (BACDL), meaning that a pass grade must be achieved on the first submission to gain a BA Honours degree classification (failing leads to an ordinary degree). The assessment needed to be redesigned because it placed the total weight of the students’ mark on a single essay. As the programme director, I wanted to engage the students in the re-design process and evaluate the impact of the new design on both students and staff.

Implementation

After attending a session on ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, I decided to explore further the idea of using multiple choice question (MCQ) tests. To do so, I attended a session on ‘Team Based Learning (TBL)’ and another on ‘MCQ: More than just a Test of Information Recall’, to gather targeted knowledge about designing effective MCQ questions.

I realised that MCQ tests can help assess students’ understanding and knowledge and also stimulate students’ active and self-managed learning. Guided by the idea of ‘assessment for learning’, I proposed the use of an MCQ test during a steering group meeting (employees and alumni) and a Board of Studies (BoS) meeting, which second-year Foundation Degree as well as BACDL student representatives attended. The idea was initially resisted, as MCQ tests are not traditionally used in HE education departments. However, after exploring different options and highlighting the advantages of MCQ tests, the agreement was unanimous. At the last BoS meeting (2016), students and staff finalised the proposal for the new design, proposing to use the MCQ test for 20% of the overall mark, keeping the essay for the remaining 80%.

At the beginning of 2017, I invited all BACDL students to anonymously post their thoughts and concerns about the new design (and the MCQ test) on Padlet. Based on these comments, I then worked closely with the programme’s student representatives and had regular meetings to discuss, plan and finalise the assessment design. We decided how to calculate the final mark (as the test was completed individually and then in a group) as well as the total number of questions, the duration of the test, etc.  A pilot study was then conducted during which a sample MCQ test was shared with all the students, asking them to practise and then provide feedback. This helped to decide the style of the questions used for the final test, an example of which is given below:

There are now more than one million learners in UK schools who speak English as an additional language (EAL). This represents a considerable proportion of the school population, well above 15 per cent. To help EAL children develop their English, teachers should do all the following, except…

a. use more pictures and photographs to help children make sense of new information.

b. use drama and role play to make learning memorable and encourage empathy.

c. maintain and develop the child’s first language alongside improving their English.

d. get children to work individually because getting them into groups will confuse them and make them feel bad for not understanding.

e. provide opportunities to talk before writing and use drills to help children memorise new language.

Impact

Students were highly engaged in the process of developing the new design, and the staff-student collaboration encouraged the development of bonds within the group. The students were excited by the opportunity to actively develop their own course, and the experience empowered them to take ownership of their own learning. All of them agreed that they felt important and, as a student representative put it, “their voices were heard”.

The new design encouraged students to take the time to gauge what they already know and identify their strengths and weaknesses. Students themselves noted that the MCQ test helped them to develop their learning as it was an additional study opportunity. One of them commented that “…writing notes was a good preparation for the exam. The examination was a good learning experience.” Staff also agreed that the test enabled students to (re)evaluate their own performance and enhance their learning. One of the team members noted that the “…test was highly appropriate for the module as it offered an opportunity for students to demonstrate their proficiency against all of the learning outcomes”.

Reflections

The new assessment design was implemented successfully because listening to the students’ voice and responding to their feedback was an essential part of the designing process. Providing opportunities to both students and staff to offer their views and opinions and clearly recognising and responding to their needs were essential, as these measures empowered them and helped them to take ownership of their learning.

The BACDL experience suggests that MCQ tests can be adapted and used for different subject areas as well as to measure a great variety of educational objectives. Their flexibility means that they can be used for different levels of study or learning outcomes, from simple recall of knowledge to more complex levels, such as the student’s ability to analyse phenomena or apply principles to new situations.

However, good MCQ tests take time to develop. It is hoped that next year the process of developing the test will be less time-consuming, as we already have a bank of questions to draw on. This will enable randomisation of questions, which will also help to avoid misconduct. We are also investigating options that would allow the test to be administered online, meaning that feedback could be offered immediately, reducing even further the time and effort required to mark the test.

Follow up

MCQ tests are not a panacea; just like any other type of assessment tool, they have advantages and limitations. This project has confirmed that MCQ tests are adaptable and can be used across subject areas as well as to measure a great variety of educational objectives. The evaluation of the assessment design will continue next year, and further feedback will be collected from the cohort and next year’s student representatives.

Independent research and research dissemination in undergraduate teaching

Dr. Ute Woelfel, Literature and Languages
u.wolfel@reading.ac.uk
Year of activity: 2016/17

Overview

In order to improve students’ engagement, support their abilities as independent learners, and increase their feeling of ownership for their academic work, elements of independent research and research dissemination through the creation of research posters were included in a Part 2 module.

Objectives

  • Boost independent learning.
  • Nurture research interests.
  • Increase feeling of ownership.
  • Develop employability skills.

Context

In 2016/17 I introduced a new Part Two module on German National Cinema (GM2CG: 20 credits / 30 contact hours). The module is intended to give students a general overview of German cinema from the end of World War I to German unification, and at the same time to allow sustained independent work on themes of interest. In order to increase engagement with these themes, the independent work is research-oriented, requiring students to reflect on their own expectations and aims, their goals for the module and indeed the course, and to develop their own interests and approach.

Implementation

At the beginning of the module, students were asked to pick a period or topic from a list and prepare a presentation. The presentation was not part of the summative assessment but served as a foundation for further research. After the presentation, individual discussions with each student were used to decide which aspect of the theme/topic the student would like to pursue further. At the end of each term, essay surgeries were offered in which students were given the opportunity to discuss the research done so far and decide on a concrete research question for their essay (2,500 words / 30%). The students were then asked to turn the findings of their essays into research posters for dissemination to non-specialist audiences (10%). To make sure that students also gain a general understanding of German cinema, a final exam (60%) is scheduled in the summer term.

Impact

The inclusion of independent research elements was very successful in that students engaged more than they normally do when given set topics and essay titles. The majority of students found secondary sources, and even additional primary sources, and often identified research topics they would like to pursue in the future. Both the essay and the exam marks were above average. The poster challenged students to re-think their academic findings and present them in a new, visually organised format for interested general audiences; as we used the posters to showcase the students’ work at the University’s Languages Festival, the Visit Days and a Reading Scholars outreach event, a sense of the importance of their work emerged and pride in what they had achieved grew. The students understood the relevance of the poster for the development of professional skills.

Reflections

The module worked well and highlighted above all the potential our students have and can develop in the right learning environment, as well as their willingness to work hard when they are committed. Their engagement with independent research signalled a wish to get active and explore options beyond the set class texts rather than being spoon-fed; there is a clear need to feel involved, responsible and in charge of one’s work. I was particularly surprised by how much effort students were prepared to put into the presentations despite the fact that they did not count towards the module mark; as the presentations were used as a foundation for assessed work, students clearly understood their benefit.

The research elements made the module learning- and teaching-intensive, as a good number of office hours and slots during the enhancement weeks were used for individual discussions of research and essay topics. As I wanted the students to put their research posters to good use as well, additional feedback slots were offered in which I discussed not just marks but ways of improving the posters; students showed great willingness to work even further on their posters just to see them exhibited, despite the fact that any further input would not change the mark.

Using online learning journals

Dr Nicola Abram, School of Literature and Languages

n.l.abram@reading.ac.uk

Overview

This entry describes the use of online Learning Journals on a Part Three English Literature module. This method of assessment supports students to carry out independent research and to reflect on their personal learning journey, and rewards students’ sustained engagement and progress.

Objectives

  • To encourage reflective learning.
  • To promote independent learning.
  • To facilitate weekly cumulative contributions to summative assessment.
  • To reward development rather than final attainment.

Context

The Part Three optional module Black British Fiction (EN3BBF) is characterised by a large number of set texts that are read at a fast pace. During a single term it covers the period from 1950 to the present day, and asks students to engage with novels, short stories, poetry, a play, and a film, as well as critical theory, history, autobiography, documentary, blogs, political speeches, and press reviews. The module is also characterised by its relevance to historical and contemporary issues of social justice. The quantity and complexity of this material requires students to exercise their independence, taking responsibility for their learning beyond the weekly three hours of tutor-led seminars.

Learning Journals had been in use for this and other modules in the Department of English Literature for several years, in the format of paper workbooks pre-printed with set questions. This effectively served the purpose of structuring students’ weekly studies and directing discussion in seminars. Students worked extremely hard to record their learning in this format, often going beyond the standard material to include additional reading and research of relevance to the module.

However, the paper workbook sometimes resulted in an excess of material that was diluted in focus and difficult to evaluate. Another problem was that the handwritten Journal was retained by the University after submission, meaning students lost this rich record of their learning.

To improve this situation, consultations were held with colleagues in the Department of English Literature and an alternative online Learning Journal was initiated in 2015/16.

Implementation

Experimentation with the Blackboard Journals tool helped to clarify its privacy controls, to ensure that tutors could see the work of all participating students but that students could not see each other’s entries. A discussion with the University of Reading TEL team clarified marking procedures, including making the Journal entries available to view by external examiners.

A discussion was held with colleagues who use paper or online Learning Journals, to establish generic assessment criteria and ensure parity of expectations.

In discussion with another module convenor it was decided that students would be required to submit ten weekly entries, each consisting of 400-500 written words or 4-5 minutes of audio or film recording. The choice of media was a proactive effort to make the Journal more accessible to students with dyslexia and those for whom English is an additional language. The subject of each entry could be determined by the student, prompted by questions on the reading list, discussion in seminars, personal reading, or other activities such as attendance at an exhibition or event.

In the first term of implementation (Autumn 2015) the full ten entries were assessed. In later iterations it was decided that students should instead select five entries to put forward for summative assessment. The selection process facilitates further self-reflection, and the option to discard some entries allows for experimentation without the threat of penalty.

The Learning Journal incorporates a vital formative function: students are invited to a 30-minute feedback tutorial to discuss their first five entries. This conversation refers to the module-specific and task-specific assessment criteria, supporting students to reflect on their work so far and to make plans to fill any gaps. The Learning Journal functions as a mode of assessment for learning, replacing the traditional task of the formative essay.

In terms of summative assessment, the five submitted Learning Journal entries account for 50% of the module mark. An essay constitutes the other 50%. These two forms of assessment are equivalent in scale, with each carrying a guideline of 2,500 words total.

Impact

The fact that students could nominate a selection of entries for summative assessment seemed to encourage risk-taking. Students were more willing to experiment with their critical responses to texts – by testing speculative interpretations, asking questions, or articulating uncertainty – and to express their ideas using creative practices. They became actively engaged in directing both the form and content of their learning.

The move to a restricted length per entry was designed to encourage students to distil their ideas, and to direct attention to the aspects of that week’s learning that most mattered to the student. This was successfully achieved, and feedback shows that they could see their own progress as the weeks passed.

Feedback also showed that students appreciated the opportunity to choose their own topic for each weekly entry, without the constraints of set questions. As a result, entries were remarkably varied. Some students took the opportunity to reflect on their personal circumstances or current political contexts (such as the construction of ‘Britain’ in the discourse around the EU referendum in 2016) using the technical vocabulary learned on the course; others explored creative media such as spoken word poetry. All students gained skills in a genre of writing different from the traditional essay format, which may prove useful for careers in the communication industries.

One unexpected benefit was that the online journal made it possible for the module convenor to track the students’ learning in real-time rather than waiting for summative assessments and end-of-term evaluations. This immediate insight enabled corrective action to be taken during the course of the module where necessary.

Reflections

Students were initially nervous about this unfamiliar method of assessment. Providing detailed module-specific and task-specific marking criteria, as well as example entries, helped to allay these fears. The decision to count only a selection of entries towards summative assessment significantly helped, allowing students to acclimatise to the task with more confidence. As the term progressed, students visibly transitioned towards autonomous learning.

The Learning Journal format proved particularly effective for this module as it created a ‘safe space’ in which students could reflect on the ways in which they have personally experienced, witnessed, or practised racism. Students’ self-reflection extended beyond the subject of skills, strengths and weaknesses to consider their embodied knowledge, ignorance, or privilege. They became more critical in their thinking and more alert and responsible as citizens. Articulating the potency of this real-world engagement, one student commented that “the consistency of the learning journal […] allowed my thinking to naturally mature and changed my outlook on society”.

Marking the Journals became much more efficient using the online format, as entries were typewritten and significantly condensed. Additionally, marking and moderating could be done remotely, without the need to exchange cumbersome documents in person.

It is striking that some students achieving high marks in their Learning Journals did not always achieve equivalent marks in their essays or other modules. I do not consider this to indicate an artificial inflation of grades; rather, I would argue that the Journal recognises and rewards skills that are overlooked in traditional assessment formats and undervalued elsewhere on our programmes. Some students used the Journal to record their personal contribution to seminar discussions and be rewarded for this, while for other students less likely to speak in class (perhaps due to EAL status, gender, disability, or personality) the private entries provided an important opportunity for their insights to be heard.

Follow up

Informal spoken feedback on the general use of Learning Journals was given to the group during seminars, and one-to-one feedback was given halfway through the module. However, several students sought additional reassurance about their entries. In 2017/18 I intend therefore to incorporate a peer-review exercise into the early weeks of the term, to allow students to benchmark their work against others’ and to promote the take-up of alternative media and approaches. This activity will help students to see themselves as a community of learners. Rather than presume that students have access to technology I will supply iPads belonging to the School of Literature and Languages for use in the classroom.

I also intend to circulate example entries in audio and video formats, to show that the Journal validates skills other than traditional essay-writing and to encourage students to experiment with alternative ways of demonstrating their learning.