School of Psychology & Clinical Language Sciences
In this 14-minute video, early rubrics adopter Dr. Allan Laville shares how he and colleagues in Psychology have sought to improve student assessment literacy, and have successfully engaged students with their assessment rubrics by embedding analysis of them into their in-class teaching and by using screencasts, discussion boards and student partnership. Lots of useful ideas and advice – well worth a watch.
Will Hughes – School of Built Environment (Construction Management & Engineering)
The personal capture pilot project helped me to develop and test ideas to advance what I had previously been trying using YouTube. One important lesson for me was that shorter videos engage students better. I also learned how to record videos featuring more than simply a talking head. Using this technology to augment the usual pedagogic techniques was very useful. I would like to replace some of my lecturing with screen-cast videos, but I have learned that there is more to this than simply recording pre-prepared lectures for my students.
My aim was to produce detailed explanations of points too elementary or too complex to address in lectures and to replace some one-to-one meetings. I aspired to produce a series of 5-10 minute videos that responded to specific student questions generated from lectures and emails. One specific idea was to support reflective portfolio writing.
My motivation to join the personal capture project was to acquire screen-casting skills and to better understand the technology.
There were two key groups I chose to produce recordings for:
40 MSc students, of whom some were flexible-modular and off-campus except when there were formal classes. The main module was CEM102: Business of Construction.
142 BSc students on a Part 2 module: CE2CPT Construction Procurement.
I tried using the webcam and laptop provided in the pilot. With these, I made some videos using the Mediasite tool, but the video and audio quality were not as high as I would have liked, and the editing offered by Mediasite was very primitive, with no opportunity to fix issues like colour grading, for example. I preferred using my own professional-grade camera, microphone and lighting. I realised that I needed much better software than Mediasite and bought a licence for Camtasia, which opened up a lot of interesting possibilities and made it possible to achieve what I had in mind.
Dialogue with students involved presenting them with a video and asking them to let me know what they thought, whether it helped, and what kind of things they would like me to cover in future.
The most well-received videos were those that summarised assignment guidance in 10-11 minutes. My video on research conceptualisation proved popular. The assignment summaries in CEM102 Business of Construction, for a Reflective Portfolio and a Case Study, were very impactful and prompted a lot of student approval.
One unanticipated experience was in using the technology to replace a lecture cancelled due to bad weather; 66% of the students accessed this 55-minute lecture, but for an average view time of only 18 minutes, which I found a depressing statistic.
Things improved as I progressed. Planned use of personal capture was much better than using it to overcome lecture cancellations. The pedagogical challenge is to figure out how to produce short videos that are useful to students. It was useful to work out which simple overviews would help the students' learning, and to produce short videos based on these. I found filming at home better than filming in the office. I have learned the importance of issuing reminders about Blackboard-posted videos, as students can miss the initial announcement and then never see the video produced for them.
I found the Mediasite tool itself clunky and challenging in terms of its permissions, lack of utility and quality.
I still believe personal capture is useful but I am thinking about changing my strategies for how to use it. The changes are not technical but pedagogical. As I move to part-time working and have less contact time with students, personal capture may become indispensable for me.
Ed Collins – School of Agriculture, Policy & Development
Personal capture software was used as a method of coaching, facilitating good study practice and identifying milestones for students in order to develop excellent assignments over two modules at undergraduate level. The impact of the project was two-fold. From a student's perspective, it enabled the students to prepare independently for the various assessments, allowing them to re-listen to the advice given. From a lecturer's perspective, it decreased the amount of face-to-face contact hours whilst maintaining high standards of tutelage.
The objective of the project was to offer an enhanced learning experience by adopting online personal capture tools to produce video resources. The focus was on helping the students prepare for the assessments that were associated with the module.
The project used 2 undergraduate modules: a second year Marketing Management module with 120 students and a final year Business Strategy module with 70 students. The modules lent themselves to alternative knowledge delivery due to the size of the cohorts and also the types of assessment. The screen-casts allowed the students to prepare each element in an organised way but also allowed them a certain degree of flexibility, as they reduced the amount of face-to-face tutorials.
From the start, the students were involved in the process. Student reps were selected and were consulted after every recording. This helped to get traction from a student point of view but also to get a sense of the reception the recordings were getting. Making the students aware of the recordings was imperative and a follow-up email when they were released was sent out. The recordings were of me delivering to camera (without use of any slides). I chose this format as I felt the students would focus more on what I said and make their own notes, rather than depending on slides.
The objectives of the project were achieved. From a student's perspective, they could download and listen to the videos at will and did so repeatedly. The recordings guided the students through the content delivered in the lecture but also through the development of the assessments. Sign-posting readings and suggesting best practice in developing the assessments formed the main thrust of the recordings' message. An unexpected outcome was the reduction in face-to-face time I had with my students. There was less demand on my office hours, which is both good and bad, as I feel it is important to encourage students to talk to their module leaders outside of class.
In my experience, Mediasite (the personal capture software used for the pilot project) did not work as smoothly as I had hoped. As a result, I adopted the software that my Dell computer recording studio offered. I am concerned that students may have unrealistic expectations about the quality of captured recordings. I feel that students are now used to high quality vlogging on YouTube and other platforms and may have an expectation that all videos produced as learning resources in their university experience need to be highly professional.
I plan to expand my use of personal capture in my practice to include the marking of scripts and giving students feedback, as well as preparing students for assignments. I will use post-graduate classes to test this in the forthcoming academic year. I will also be mentoring other staff members to use recordings as much as possible for their courses, much as I have during this project.
Calvin James Smith – Department of Maths & Statistics
We created short videos advertising the content of modules to enable students to make more informed choices during the module selection process. Staff reported mixed experiences and interest in Mediasite personal capture so other mechanisms were also used (e.g. Camtasia, use of camcorder). Student feedback was positive and did not single out a preferred model of video recording.
To create a library of short videos promoting module content to support the module selection process. The library is to be made available via a Blackboard Organisation. Videos should be:
Short / concise
Focus on main content of module (not elements whose emphasis depends on staff delivering the module)
Student feedback had revealed that students were feeling there was a lack of guidance and support around module selection, with some students reporting that they only discovered a module wasn’t for them after it was too late to change. Historically, we had provided module selection advice via the tutor system and carousel style talks after the exams periods or in Week 6; however, these mechanisms have experienced declining levels of student participation / efficacy in recent years so a new approach was trialled using the Personal Capture pilot.
We made videos for a wide range of Part 2 and final year modules and made these available in Blackboard alongside “pathway diagrams” showing the pre-requisites linking modules.
Maths and Statistics has a mixed relationship with the use of screen-casts (typically linked to difficulties in capturing mathematical notation), so it was necessary to develop options for producing videos to enable colleagues to select the mechanism which worked best for them. My colleague Hannah Fairbanks and I put together two sample videos for MA2MPH (produced using Mediasite) and ST2LM (using a camcorder), and shared these with colleagues alongside an offer of support to produce their own content. No pressure or steer to use one mechanism (Mediasite or camcorder) was provided; rather, we prioritised ease of producing AV content in a way colleagues felt comfortable with. Typically, Hannah or I would arrange a time to meet with colleagues and support them one-to-one. In addition, some staff used the Camtasia tool.
We spoke with students continuously throughout the process to receive feedback on what was useful content, both informally and using a feedback survey.
We created module selection videos for 6 (of 15) Part 2 modules and 10 (of 24) final year modules. These recordings were made available in a Blackboard Organisation called Maths Module Selection, alongside pathway information about how the modules fitted together both within and between years, and alongside conventional resources such as the module catalogue and programme.
Staff involvement with the Personal Capture pilot did appear to promote additional discussions about inclusive practices and accessibility of resources.
Student feedback was broadly positive, indicating that this was a suitable solution to the challenge of supporting their selection of optional modules.
I was particularly pleased to be able to provide inclusive module selection support at times that suited students rather than being conditional on staff availability, etc. However, I was unable to convince all colleagues delivering optional modules of the merits of producing these videos so our coverage is not complete; student feedback identified the deficit and has asked for the remaining videos to be produced. It is undeniable that some staff were put off due to the additional burden of producing transcripts in order to meet our accessibility obligations (although we have had some successes using Google Docs to ease production of these).
Staff who had already developed slides for module delivery were typically more willing to engage with the process (talking over these slides); otherwise it was challenging to solicit involvement, with broad reluctance to appear as a ‘talking head’ or be filmed at the board.
We won’t know if this has been a successful means for supporting module choice until we see a reduction in ‘module tourism’ in the 2019-20 cycle.
I’m hoping that now a bank of videos is available that we can “fill in the gaps” on a more leisurely timescale enabling colleagues to contribute without the time pressures of the pilot project.
Dr Rhianedd Smith – University Museums & Special Collections Services
This project looked at creating extension videos to help people access behind-the-scenes and professional skills from museums and heritage sites which could not be delivered in class. It also allowed revision of skills which might otherwise be forgotten after sometimes overwhelming live workshops.
Use personal capture to enhance a more general strategy of digital scaffolding for students working with collections.
Teach students behind-the-scenes skills e.g. using a catalogue or packing museum objects.
Allow access to expertise which might not be possible in a classroom setting through recorded interviews with staff.
Create a format that helps students with additional needs to revise skills which may require repeated instruction, or to access them outside of an overwhelming live workshop.
We have seen an increase in staff wanting to use collections and field visits for teaching, which has meant a greater cohort of novice staff and students to support through basic skills training. We only have a small team of professional staff, and when they were used in class it was often to deliver standard information, not making the best use of their expertise and knowledge of the collections. It also meant that time with collections was being wasted for students and academic staff.
For certain practical workshops, it was very hard for students to catch up via lecture notes. We realised that certain skills were actually more appropriate for one-to-one training which we could not support face-to-face e.g. taking somebody through catalogue searching. Through talking with the RUSU Disability Rep and the Disability Office, we also realised that practical workshops can be stressful, tiring and overwhelming for some disabled students, students with mental health issues, or students with chronic illness. While we still focus on hands-on teaching, we realise some students might be doubly disadvantaged around core skills by being more likely to miss classes due to illness or anxiety about the format and then having no way to catch up.
We have a small cohort so we have been able to discuss with them over several years what kinds of scaffolding might be needed. This year, we did some evaluation of our modules funded by T&L Dean Elizabeth McCrum which revealed, for example, that students felt they needed more museum-specific guidance around project planning and team work. We also had ongoing meetings with the RUSU Disability Rep (Blythe Varney) who was working on a project to create volunteering resources for students with disabilities.
Adam Lines, Nicola Pickering and I have worked together on this. We created a Trello board to suggest possible videos which might be created and have been interviewing staff.
We have tended to make the videos unscripted but with multiple tries at some of the commentary. Using PowerPoint slides helps and Rhi and Adam have used them in a similar way to how they are used in lectures where the slides are prompts.
Depending on the topic, we’ve used a mix of narrated PowerPoint, screen capture with a talking head, straight video uploaded, and we’re looking at pure audio with images for some of the staff interviews. We’ve made some of the videos through uploading via material recorded on an SLR camera, or using an app on an iPhone.
We had to edit the videos. With the PowerPoint ones, it was much easier (just trimming the beginning and end and editing the intro panels). For more complex videos, we used Final Cut Pro, which we have as a result of #DigiRDG.
Two students made some scripted film via SLR (funded by the Diversity and Inclusion fund) and we’re also hoping to use this. Given the reliance on the desktop or laptop, we’re thinking any future student input is probably going to be captured using the iPhone Mediasite app. Students also contributed photographs for the field-site visit and appeared in the film.
Adam has actually made some of the videos publicly available as he realised that they would be useful for all visitors, not just students. We’ll be embedding these in a new Museums & Collections Portal launching in December. Rhi is fairly digitally literate but has not used video before. A lot of her scaffolding content on Blackboard was text-based and this definitely changed the way that she thought about scaffolding options. At the time of writing, we haven’t had the chance to fully evaluate the student response to the new video content yet but will be working with key stakeholders over the next year as part of a wider TLDF project about skills provision. We’re running some of our videos with a new first year cohort in mind (e.g. the guides to the museums) so we should be getting feedback on them in the next couple of months. We’ll be asking questions in class and using Mediasite and Blackboard analytics to gather data on use. We’ll also be asking students for requests for videos.
Our videos were sometimes quite ambitious, and thus took more time to organise with other staff. We found that other staff were also concerned about being on camera and had to be properly prepped. This has meant that we haven’t fully explored the options for making the kind of ‘quick fix’ videos which personal capture lends itself well to. Our ambition also meant that we sometimes had to bring in other technologies and software to produce the screen-casts but use Mediasite to embed them within Blackboard. For example, the fieldwork video at a National Trust site required use of a phone (still using the Mediasite app). Adam had had training in video creation as part of #DigiRDG, so he used a camera and more sophisticated editing software for some of the behind-the-scenes videos which required high quality footage and mobility (not always possible with a screen camera).
We haven’t had the chance to fully evaluate our use of personal capture yet as we had to use the summer to create a lot of the content. This should be something we take into consideration in the future when creating updates each year.
We have been looking at digital and in-class scaffolding more generally, so Mediasite became an important part of that toolkit when thinking this through. We have received TLDF funding to explore skills training through working with collections and will be creating something like UCL’s ABC workshop (but with collections as the focus). We’d like to include information on personal capture as one of the ways of supporting learning when developing a collections module. The pilot project definitely allowed us to think about exactly what personal capture might be good for in this context, and we can use this knowledge to train other academics and professionals.
The main motivation for me to use personal capture was to create short videos (lasting six to seven minutes) explaining various financial concepts in Excel. I teach Financial Modelling (a practical hands-on module, taught in a computer lab) to Part 2 students, for which I created the videos using the Mediasite software. I then uploaded them to Blackboard (via a video library) for this module. While undertaking this task, I consulted the colleagues working with me on personal capture, as well as students, and received strong constructive feedback from students regarding their increased understanding and the usefulness of the videos as revision materials. Responses to the survey I sent out were also highly positive, with students strongly agreeing with the positive aspects of this project.
My objectives during the personal capture pilot project were:
To enable students to go back to the videos as often as they want. There are lots of small steps involved in Excel in the process of answering a numerical question – so students can catch up on concepts they missed or did not understand the first time round.
To allow students to watch the videos as a revision guide/tool.
To facilitate group projects – these videos can be accessed and watched by all members of the group, hence avoiding any misunderstanding of the process.
To support students who wanted to get ahead of the lectures by giving them the opportunity to look at the videos beforehand.
To encourage students to self-study and hence, become more independent learners.
Financial Modelling is an applied module, involving the use of Excel for its implementation. There are many steps involved in the computation of a financial model, hence personal capture is very helpful as it allows the students to get greater control of their learning, e.g. pausing the videos and re-watching relevant parts.
Part 2 students made a request at our SSLC meeting (in December 2018) to have recordings for my module (IC206 Financial Modelling) – for reasons such as accessing the Excel explanations when attempting tutorial questions and for revising. However, there is currently no such facility in our Dealing Room where I teach this module. At the same time, staff were being invited to apply for this Personal Capture Pilot scheme, which I thought would be brilliant for my module.
During the lectures, I had to go and see/help students individually at their computers when they were stuck on certain features – which would sometimes mean that I might not finish the lecture on time. Hopefully, with the availability of these videos beforehand, this problem might not occur.
Following a bumpy start getting set up, I found the process of recording very straightforward. I followed TEL’s step-by-step guidance on Blackboard about how to do the recordings (on My Mediasite) and transfer the videos into the Video Library where students can access them.
There were no students involved in the actual making of those videos but, having spoken to them (in person) about this project, my impression was that they would absolutely love this concept. From the student survey I sent out, one of the respondents said that he/she was jealous of the future cohort having access to these video resources!
Though the response rate to the student evaluation survey was not very high, given the time of year they received it, everybody who filled in the survey had only positive comments.
The results from the survey were extremely positive, with students strongly agreeing as a result of having access to the screen-cast videos with (1) improved revision notes, (2) increased understanding of the materials, (3) engaging videos and lessons, (4) appropriate communications, (5) greater control of learning, (6) useful tool for catching-up, (7) complementing lecture notes and (8) discussions with peers as a result, among others. Students also mentioned that they are highly likely to watch those videos again. I also got a comment that this will not stop them coming to see me on an individual basis – which is good because the videos are not a substitute for this.
Based on the above initial comments, I would say that my objectives were met. As the response rate to my survey was not high, I did not receive any unexpected outcomes. I would expect more and positive feedback when I repeat this work with students next year.
I have been working with a small number of HBS colleagues as co-partners on the project, helping to support them in using Mediasite for personal capture. Many colleagues in the Henley Business School are aware of the pilot project and interested in the potential of screen-cast videos: contact hours with students might be reduced prior to tests, coursework deadlines and exams, as students can refer to the video support materials. Based on my conversations, many are keen to adopt personal capture once it is made available to all staff.
The first challenge was to install the Mediasite software on my personal computer. I found it hard to set-up and I had to get the IT team involved – this was not very straightforward. However, once the software was installed and running, the whole process of creating videos and making them available on Blackboard was plain-sailing. Lots of detailed guidance was given by the TEL team on Blackboard.
The personal capture pilot project started in December 2018/January 2019 and I created these videos for a Part 2 module which takes place in the autumn term. This meant that I could not get a response from the initially targeted cohort. Ideally, I should have done this project for a module which I teach over the spring term – in this way I would have received contemporaneous feedback from the required audience.
The main challenge for using personal capture is to engage students, to ensure that students take the opportunities to watch the videos and gain knowledge from the process. I should have a good idea about this next year as an early adopter of the technology.
I did not send the evaluation survey to the current students who were studying for the module (for whom the videos were made) this academic year. Hence, I would like to introduce this Personal Capture project to the Part 2 students during the autumn term of the 2019-2020 academic year and ask for them to complete a survey. One strategy for its evolution might be that the students would watch the basic Excel videos before the lectures – freeing some time at the end for attempting some of the tutorial questions.
Elisabeth Koenigshofer – School of Literature & Languages
As part of the Personal Capture Pilot Project, I had many ideas to create videos for my students to enhance their learning. Eventually, I created one video that students could refer to if they needed support when adding a post to a blog on Blackboard, which was one of their tasks in their first year German language course.
Help students understand class content more easily
Provide reference material for students
Help absent students to catch up with class content
Provide a different format for content than usual, engaging a wider variety of learner types
Use an audio-visual format to provide a multi-sensory learning experience (visual, aural, written text)
I thought that screen-casts would be a great idea to add another format to my teaching portfolio when teaching German language courses. I had the idea that students at all levels, but especially at lower levels (language levels A2-B1), would like to engage with videos and that this format would make it easier for students to understand and revise class content. Also, the combination of different aspects should help students: the added aural component helps students to familiarise themselves with spoken German, which is part of their learning process. They can pause and replay and see how much they understand.
I recorded my video without student involvement because I wanted it to be a reference for students before they set out to create their blogposts. Before producing this video, I tried to create some trial videos to make sure that I was comfortable with the situation and that I had prepared what I was going to say to help stay clear and focused.
We had a workshop at MERL about the museum and the tourists that are attracted to visit it. I put the blog support video online simultaneously to ensure that students who couldn’t attend the workshop would know what to do and that those who were at the MERL would be able to go back to the information, in case there were questions on how to complete the task.
In the video, I went through the step-by-step process of creating a blog post in Blackboard. This way, it was very clear how the task should be completed and which options were available to the students (e.g. add a picture/audio file or a link).
The video was made available to the 27 students who were part of the course. Out of these 27, 7 viewed the video anonymously. The longest view covered the video’s full duration, while the shortest lasted less than half a minute. The viewers were anonymous because I embedded the video into Blackboard and thus the viewer data was not retrieved. This might mean that 7 students watched parts of the video and it helped them with their tasks. Most students completed the task successfully, but I cannot tell for sure whether or not that was due to the video, as there was no feedback other than the statistics. I think there might have been a chance to increase views of the video if I had pointed students towards it more often. I made students aware of the video in class, and I think views would have increased if I had also made one or several Blackboard announcements.
I really enjoyed creating my video and I think the format has a lot of potential. However, I would want to invest more time in creating screen-cast videos, which I did not have this year due to circumstances. In general, I think the biggest challenges are to plan a video effectively and to record it in one go without too many glitches. The Mediasite tool is easy to use for recording and is capable of some editing, but in order to create a smooth video, I think it is helpful to have it mapped out properly beforehand and to dedicate more time to the video creation.
In the future, I will use the video format for preparatory tasks and prepare questions to accompany the videos so that students can engage better with them. Currently, I am preparing audio-only material for students that comes with specific listening tasks which are then part of the personal capture and should help students to improve their listening skills.
Jo Stringer – Henley Business School (Real Estate & Planning)
I created catch-up screen-casts for the first two live sessions of my postgraduate Law module. These enabled students to catch up on missed live sessions. The screen-casts were accessed by 25% of the students and the feedback was very positive. Key finding: the students want more recorded versions of live sessions. They seem satisfied with simple, basic screen-casts plus audio.
Create screen-casts of early module material;
Aim to create screen-casts that:
are concise and engaging;
enable the students to catch up flexibly but efficiently;
do not just repeat the live lecture. Create a different resource which takes advantage of the online environment.
I teach on the REMF54 module: Property Law with 48 students. Attendance is problematic in the initial weeks of the module. Students are invited to Employer Assessment Centres which clash with live sessions. Since important foundation concepts are covered in the early sessions, I wanted to create an engaging and effective resource to encourage the students to take responsibility for catching-up, in addition to the “static” materials already provided (lecture slides, workshop materials, supplementary course text reading etc.). I also wanted the students to catch-up in “course-time”, rather than leaving it until the end of the module: screen-casts could be a less intimidating/more manageable route into achieving this.
I began by editing the live session slides with the intention of creating bite-sized screen-casts. This proved tricky since the material is complex and hard to prioritise. Ultimately, I decided to create 3 approx. 15 min. screen-casts, dealing with the learning outcomes independently, for each of 2 live sessions. I wrote a script to ensure I retained focus and clarity.
I introduced recap quizzes to keep learning active. Whilst recording, I encouraged the students to pause the recording to write down answers to the quizzes. I revealed the answers at the end of each recording.
No students were involved in the recordings but, as I created the screen-casts after the live delivery, I ensured I dealt with questions and areas of difficulty that arose from class.
I recorded the screen-casts using the slideshow plus audio option and made use of simple edit functions where necessary, mainly cutting and fading in/out before uploading to Blackboard.
The screen-casts were used by 25% of the cohort. Most of the students accessed the screen-casts for Sessions 1 and 2 at the same time, two weeks after Session 1 was delivered live. This was unexpected as I had anticipated the students would have left catching-up until immediately prior to the assessment. The Session 2 screen-casts attracted fewer views than Session 1 and there were drop-off points in the final two Session 2 screen-casts at the mid-point which are not evident for Session 1 (see screenshots below). This may indicate “screen-cast fatigue” which could increase if more were produced.
Student views for the session 1 part 1 screen-cast
Student views for the session 2 part 3 screen-cast
Students’ feedback from evaluation survey – Statements about the screen-casts attracting strong agreement:
allowed students to catch up: 85%;
increased knowledge and understanding: 92%;
control of own learning: 100%.
The feedback indicates that the screen-casts were valued and considered effective. The screen-casts were, in my view, quite rough and ready: it was a challenge to create an engaging product in the time available. I could have made greater use of the online environment by, for example, thinking more explicitly about how I was explaining things to a listener and developing the slides to make them more visual. In any event, this does not appear to have filtered through into the feedback with 76% of students agreeing or strongly agreeing that the screen-casts were engaging.
I have made a screen-cast on a particularly complex topic as an exam resource for my undergraduate module. The question on this topic was the second most popular choice in the exam, and the average mark was the second highest, rising from 50% last year to 55% this year. I have also used personal capture to record a Powtoon video I created.
Tamara Wiehe, Charlotte Allard & Hayley Scott (PWP Clinical Educators)
Charlie Waller Institute; School of Psychology and Clinical Language Sciences
In line with the University’s transition to Electronic Management of Assessment (EMA), we set out to create an electronic Portfolio (e-Portfolio) for use on our Psychological Well-being Practitioner (PWP) training programmes to replace an existing hard-copy format. The project spanned almost a year (October 2018 – September 2019) as we took the time to consider the implications for students, supervisors in our IAPT NHS services, University administrators and markers. Working closely with the Technology Enhanced Learning (TEL) team led us to a viable solution that has been launched with our new cohorts from September 2019.
Create an electronic Portfolio in line with EMA that overcomes existing issues and improves the experience for students, NHS supervisors, administrators and markers.
Work collaboratively with all our key stakeholders to ensure that the new format satisfies their various needs.
A national requirement for PWPs is to complete a competency-based assessment in the form of a Portfolio that spans the three modules of their training. Our students are employed by NHS services across the South of England, and many live close to their service rather than the University.
The issue? The previous hard-copy format meant that students spent time and money printing their work and travelling to the University to submit/re-submit it. University administrators and markers reported issues with transporting the folders to markers and storing them, especially with the larger cohorts.
The solution… To resolve these issues by transitioning to an electronic version of the Portfolio.
October 2018: An initial meeting with TEL was held to discuss the practicalities of an online Portfolio submission.
October 2018 – March 2019: TEL created several prototypes of options for submission via Blackboard, including the use of the journal tool and a zip file. On practical grounds, the course team decided on a single-file Word document template.
April – May 2019: Student focus groups were conducted with both programmes (undergraduate and postgraduate) where the same assessment sits, to gain their feedback on the potential solution we had created. Drawing on the outcomes of the focus groups and staff meetings, it was unanimously agreed that the proposed solution was viable for use with our future cohorts.
June 2019: TEL delivered a training session for staff and administrators to become familiar with the process from both student and staff perspectives. TEL also created a guidance document for administrators on how to set up the assignment on Blackboard.
July – August 2019: Materials including the template and rubrics were amended and formatted to meet the requirements for online submission on both the MSci and PWP courses. Resources were also created for students to access on Blackboard, such as screen-casts on how to access, complete and submit the Portfolio using the electronic format; the aim of this is to improve accessibility for all students on the course.
September 2019: Our IAPT services were notified of the changes as the supervisors there are responsible for reviewing and ‘signing off’ on the student’s performance before the Portfolio is submitted to the University for a final check.
Thus far, the project has achieved the objectives it set out to achieve. The template for submission is now available for students to complete throughout their training course. This will modernise the submission process and make it less burdensome for students, supervisors, administrators and markers.
The students in the focus groups reported that this would significantly simplify the process and remove the barriers they had often reported in completing and submitting the Portfolio. So far, there have been no unexpected outcomes from the development of the Portfolio. However, we aim to review the process with the first online Portfolio submission in June 2020.
Upon reflection, the development of the online Portfolio has so far been a success. Following student feedback, we listened to what would improve their experience of completing the Portfolio. From this we developed an online Portfolio, meeting the requirements across two BPS accredited courses which will be used for future cohorts of students.
Additionally, the collaboration between staff, students and the TEL team has led to improved communication across teams and the sharing of new ideas; this is something we have continued to incorporate into our teaching and learning projects.
An area to develop for the future would be to use dedicated Portfolio software. Initially, we wanted to use the journal tool on Blackboard; however, it was not suitable for the needs of the course (most notably, exporting the submission and mark sheet to external parties). We will continue to review these options and to gather feedback from future cohorts.
Dr Bolanle Adebola is the Module Convenor and lecturer for the following modules on the LLM Programme (On campus and distance learning):
International Commercial Arbitration, Corporate Governance, and Corporate Finance. She is also a Lecturer for the LLB Research Placement Project.
Bolanle is also the Legal Practice Liaison Officer for the CCLFR.
For the student:
• To make the assessment criteria more transparent and understandable.
• To improve assessment output and essay writing skills generally.
For the teacher:
• To facilitate assessment grading by setting clearly defined criteria.
• To facilitate the feedback process by creating a framework for dialogue which is understood both by the teacher and the student.
I faced a number of challenges in relation to the assessment process in my first year as a lecturer:
• My students had not performed as well as I would have liked them to in their assessments.
• It was my first time having to justify the grades I had awarded, and I found that I struggled to articulate clearly and consistently the reasons for some of them.
• I had been newly introduced to the step-marking framework for distinction grades as well as the requirement to make full use of the grading scale which I found challenging in view of the quality of some of the essays I had graded.
I spoke to several colleagues but came to understand that there were as many approaches as there were people. I also discussed the assessment process with several of my students and came to understand that many were unsure and unclear about the criteria by which their assessments were graded across their modules.
I concluded that I needed to build a bridge between my approach to assessment grading and my students’ understanding of the assessment criteria. Ideally, the chosen method would facilitate consistency and the provision of feedback on my part, and improve the quality of essays on my students’ part.
I tend towards the constructivist approach to learning which means that I structure my activities towards promoting student-led learning. For summative assessments, my students are required to demonstrate their understanding and ability to critically appraise legal concepts that I have chosen from our sessions in class. Hence, the main output for all summative assessments on my modules is an essay. Wolf and Stevens (2007) assert that learning is best achieved where all the participants in the process are clear about the criteria for the performance and the levels at which it will be assessed. My goal therefore became to ensure that my students understood the elements I looked for in their essays; these being the criteria against which I graded the essays. They also had to understand how I decided the standards that their essays reflected. While the student handbook sets out the various standards that we apply in the University, I wanted to provide clearer direction on how they could meet or how I determine that an essay meets any of those standards.
If the students were to understand the criteria I apply when grading their essays, then I would have to articulate them. Articulating the criteria for a well-written essay would benefit both me and my students. For my students, in addition to a clearer understanding of the assessment criteria, it would enable them to self-evaluate, which would improve the quality of their output. Improved quality would lead to improved grades, and I could give effect to university policy. Articulating the criteria would benefit me because it would facilitate consistency. It would also enable me to give detailed and helpful feedback to students on the strengths and weaknesses of the essays being graded, as well as on their essay-writing skills in general, with advice on how to improve different facets of their outputs going forward. Ultimately, my students would learn valuable skills which they could apply across the board and after they graduate.
For assessments which require some form of performance, essays being an example, a rubric is an excellent evaluation tool because it fulfils all the requirements I have expressed above (Brookhart, 2013). Hence, I decided to present my grading criteria and standards in the form of a rubric.
The rubric is divided into 5 criteria, which are set out in 5 rows.
For each criterion, there are 4 performance levels which are set out in columns: Poor, Good, Merit and Excellent. An essay will be mapped along each row and column. The final marks will depend on how the student has performed on each criterion, as well as my perception of the output as a whole.
Studies suggest that a rubric is most effective when produced in collaboration with the students (Andrade, Du and Mycek, 2010). However, when I created my rubric, I did not involve my students. I thought that would not be necessary, given that my rubric was to be applied generally and with changing cohorts of students. Notwithstanding, I wanted students to engage with it. So, the document containing the rubric has an introduction addressed to the students, which explains the context in which the rubric has been created. It also explains how the rubric is applied and the relationship between the criteria. It states, for example, that ‘even where the essay has good arguments, poor structure may undermine its score’. It explains that the final grade combines both objective assessment and a subjective evaluation of the output as a whole, which is based on the marker’s discretion.
To ensure that students are not confused about the standards set out in the rubric and the assessment standards set out in the students’ handbook, the performance levels set out in the rubric are mapped against the assessment standards set out in the student handbook. The document containing the rubric also contains links to the relevant handbook. Finally, the rubric gives the students an example of how it would be applied to an assessment. Thereafter, it sets out the manner in which feedback would be presented to the students. That helps me create a structure in which feedback would be provided and which both the students and I would understand clearly.
My students’ assessment outputs have been of much better quality, and so have achieved better grades, since I introduced the rubric. In one of my modules, the average grade, as recorded in the module convenor’s report to the external examiner (MC’s Report) for 2015/16, was 64.3%. 20% of the class attained distinctions, all in the 70-79 range. That year, I struggled to give feedback and was asked to provide additional feedback comments to a few students. In 2016/17, after I introduced the rubric, there was a slight dip in the average mark to 63.7%. The dip was caused by a fail mark within the cohort; if that fail mark is controlled for, the average percentage had in fact crept up from 2015/16. There was a clear increase in the percentage of distinctions, which rose to 25.8% from 20% in the previous year. The cross-over came from students who had been in the merit range. Clearly, some students had been able to use the rubric to improve the standard of their essays. I found the provision of feedback much easier in 2016/17 because I had clear direction from the rubric. When giving feedback, I explained both the strengths and weaknesses of the essay in relation to each criterion. My hope was that students would apply the advice more generally across other modules, as the method of assessment is the same across the board. In 2017/18, the average mark for the same module went up to 68.84%; 38% of the class attained distinctions, with 3% attaining more than 80%. Hence, in my third year, I have also been able to utilise step-marking in the distinction grade, which has enabled me to meet the university’s policy.
When I introduced the rubric in 2016/17, I had a control module, by which I mean a module in which I neither provided the rubric nor spoke to the students about their assessments in detail. The quality of assessments from that module was much lower than from the others, where the students had been introduced to the rubric. In that year, the average grade for the control module was 60%, with 20% attaining a distinction and 20% failing. In 2017/18, while I did not provide the students with the rubric, I spoke to them about the assessments. The average grade for the control module was 61.2%, with 23% attaining a distinction. There was a reduction in the failure rate to 7.6%. The distinction grade also expanded, with 7.6% attaining a higher distinction grade. There was movement from both the failure grade and the pass grade to the next standard/performance level. Though I did not provide the students with the rubric, I still provided feedback to them using the rubric as a guide. I have found that it has become ingrained in me and is a very useful tool for explaining the reasons for my grades to my students.
From my experience, I can assert, justifiably, that the rubric has played a very important role in improving the students’ essay outputs. It has also enabled me to improve my feedback skills immensely.
I have observed that, as the studies in the field argue, it is insufficient merely to have a rubric. For the rubric to achieve the desired objectives, it is important that students actively engage with it. I must admit that I did not take a genuinely constructivist approach to the rubric: I wanted to explain myself to the students. I did not really encourage a two-way conversation, as the studies recommend, and I think this affected the effectiveness of the rubric.
In 2017/18, I decided to talk the students through the rubric, explaining how they could use it to improve performance. I led them through the rubric in the final or penultimate class. During the session, I explained how they might align their essays with the various performance levels/standards. I gave them insights into some of the essays I had assessed in the previous two years, highlighting which practices were poor and which were best. By the end of the autumn term, the first module in which I had both the rubric and an in-class explanation of its application saw a huge improvement in student output, as set out in the section above. The results have been the best I have ever had. As the standards have improved, so have the grades. As stated above, I have been able to achieve step-marking in the distinction grade while improving standards generally.
I have also noticed that even where a rubric is not used, but the teacher talks to the students about the assessments and their expectations of them, students perform better than where there is no conversation at all. In 2017/18, while I did not provide the rubric to the control module, I discussed the assessment with the students, explaining practices which they might find helpful. As demonstrated above, there was a lower failure rate and improvement generally across the board. I can conclude, therefore, that assessment criteria ought to be explained much better to students if their performance is to improve. However, I think that having a rubric, and student engagement with it, is the best option.
I have also noticed that many students tend to perform well, in the merit bracket. These students would like to improve but are unable to work out how to do so. These students, in particular, find the rubric very helpful.
In addition, Wolf and Stevens (2007) observe that rubrics are particularly helpful for international students, whose previous assessment systems may have been different, though no less valid, from that of the system in which they have chosen to study. Such students can struggle to understand what is expected of them and so may fail to attain the best standards/performance levels that they could, for lack of understanding of the assessment practices. A large proportion of my students are international, and I think that they have benefitted from having the rubric, particularly when they are invited to engage with it actively.
Finally, the rubric has improved my feedback skills tremendously. I am able to express my observations and grades in terms well understood both by myself and my students. The provision of feedback is no longer a chore or a bore. It has actually become quite enjoyable for me.
On publishing the rubric to students:
I know that Blackboard gives the opportunity to embed a rubric within each module. So far, I have only uploaded copies of my rubric to Blackboard for the students on each of my modules. I have decided to explore the embedded option to make the annual upload of the rubric more efficient. I will also see whether Blackboard offers opportunities to improve the rubric, which will be a couple of years old by the end of this academic year.
On the Implementation of the rubric:
I have noted, however, that it takes about half an hour to explain the rubric to students for each module, which eats into valuable teaching time. A more efficient method is required to provide good assessment insight to students. This summer, in my capacity as examination officer, I will liaise with my colleagues to discuss providing a best-practice session for our students in relation to their assessments. At the session, students will also be introduced to the rubric. The rubric can then be paired with actual illustrations, which the students can be encouraged to grade using its content. Such sessions will improve their ability to self-evaluate, which is crucial both to their learning and to the improvement of their outputs.
• K. Wolf and E. Stevens, ‘The Role of Rubrics in Advancing and Assessing Student Learning’ (2007) 7(1) Journal of Effective Teaching 3. https://www.uncw.edu/jet/articles/vol7_1/Wolf.pdf
• H. Andrade, Y. Du and K. Mycek, ‘Rubric-Referenced Self-Assessment and Middle School Students’ Writing’ (2010) 17(2) Assessment in Education: Principles, Policy & Practice 199. https://www.tandfonline.com/doi/pdf/10.1080/09695941003696172?needAccess=true
• S. Brookhart, How to Create and Use Rubrics for Formative Assessment and Grading (Association for Supervision & Curriculum Development, ASCD, VA, 2013).
• Turnitin, ‘Rubrics and Grading Forms’ https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark
• Blackboard, ‘Grade with Rubrics’ https://help.blackboard.com/Learn/Instructor/Grade/Rubrics
• Blackboard, ‘Import and Export Rubrics’ https://help.blackboard.com/Learn/Instructor/Grade/Rubrics