Managing transition to the MPharm Degree

Dr John Brazier, Chemistry, Food and Pharmacy
j.a.brazier@reading.ac.uk

Overview

The MPharm degree at the University of Reading has a diverse student cohort, in terms of both ethnicity and previous academic experience. During the most recent development of our programme, we have introduced a Part One assessment strategy focused on developing an independent learning approach.

Objectives

  • To use a formative assessment strategy to encourage independent learning.
  • To use timetabling to ease the transition to higher education.
  • To reduce students’ fixation on their grades, and encourage them to instead focus on feedback.

Context

It was clear from Part Two results that our students were not progressing from Part One with the knowledge and skill set necessary to succeed on the MPharm course. The ability to pass Part One modules while underperforming in examinations was identified as a key issue. Students' reliance on the standard information provided during lectures, and their inability to study beyond it, was affecting their final grades.

Implementation

When designing our programme, we introduced a requirement not only to pass each module at 40%, but also to pass each examination with a mark of at least 40%. It was felt that this would ensure that students entering Part Two were equipped with the basic knowledge to succeed, allowing them to concentrate on developing the higher-level skills required for Parts Three and Four rather than having to return to Part One material. Because the examination pass requirement was a challenge, we developed a formative/diagnostic assessment strategy to support the students throughout the year.

To ease the transition from further education to university level, we designed a timetable that required students to attend teaching sessions intensively for the first five weeks, with contact time then reducing gradually over the following weeks and terms. This allowed us to direct their learning during the first few weeks of term, and then gave them time to develop their independence once familiar with university life.

Diagnostic and formative assessment points were spaced throughout the two teaching terms, starting with in-class workshops and tutorials and online Blackboard tests. Towards the end of the Autumn term, the students were given an open-book mock examination, followed by an opportunity to mark their own work under the direction of an academic. This approach continued in the Spring term, culminating in a full two-hour mock examination which was marked and returned with feedback before the end of term.

Impact

As suspected, the level of progression at first attempt was considerably lower than desired, with a high number of students failing the examined component. After resits, the number failing to progress was much lower, and attrition rates for this cohort at Part Two were substantially lower still. Requiring the students to gain a strong baseline of knowledge and understanding in Part One put them in a better position for Part Two, and the high pass rate at Part One resits showed that the students must have developed some independent learning skills, as they did not have access to direct teaching between the main exams and the resits.

Reflections

The main issue now facing us is the high number of students failing to progress at first attempt. We believe this is due to a combination of poor attendance and engagement from the Part One students, along with a lack of understanding about developing independent study skills. Although we expect students to develop independence with their learning, it is clear that some do not understand what this means, or how to approach their studies. Once the students pass Part One they continue to do well at Parts Two and Three, but we need to address the issues with progression at Part One.

Follow up

In order to improve our pass rate at Part One, we plan to develop a more robust process to identify and support students who are failing to engage with the course. This will be through comprehensive attendance monitoring and follow up by personal tutors, along with clear communication about expectations and independence. Students will initially get guidance on what they should have covered during timetabled teaching sessions, along with suggested independent work. As the year progresses, this guidance will become less detailed in order to further promote independence.

Using personal capture to support students to learn practical theory outside of the laboratory

Dr Geraldine (Jay) Mulley – School of Biological Sciences  

Overview

I produced four screen casts to encourage students to better prepare for practical classes and to reinforce practical theory taught in class. Approximately 45% of the cohort watched at least some of the video content, mainly in the few days leading up to the practical assessment. The students appreciated the extra resources, and there was a noticeable improvement in module satisfaction scores.

Objectives

  • To provide consistency in the teaching of practical theory across groups led by different practical leaders
  • To provide students with engaging resources for use outside the classroom, both as preparation tools for practical classes and as revision aids for the Blackboard-based practical assessment

Context

The Part 1 Bacteriology & Virology module includes 12 hours of practical classes designed to teach students key microbiological techniques and theory. I usually begin each practical with a short lecture-style introduction to explain what the students need to do and why. The three-hour classes are typically very busy, and I have observed that some students feel overwhelmed with “information overload” and find it hard to assimilate the theory whilst learning the new techniques. I have had to schedule multiple runs of practical classes to accommodate the large cohort, and my colleagues now teach some of the repeat sessions. My aim was to create a series of videos, accessible outside the classroom, explaining the theoretical background in more detail. I hoped this would ensure consistency in what is taught to each group and give the students more time to focus on learning the techniques during the classes, and that they would use the resources both to prepare for the classes and to revise for the practical assessment.

Implementation

I initially tried to record four videos by simply recording myself talking through the original PowerPoint presentations I use in the practical class introductions (i.e. one video for each of the four practical classes). Having started to make the videos, I realised that it was very difficult for me to explain the theory in this format, which was quite surprising given that this is how I had been delivering the information up until that point! I therefore adapted the PowerPoint presentations to make videos focusing on each of the experimental themes, talking through what the students will do in the lab week by week, with an explanation of the theory at appropriate points. I recorded the video tutorials using the Mediasite “slideshow + audio” option and narrated freestyle, as I would in a lecture (no script). When I made a mistake, I paused for a few seconds and then started the sentence again. After finishing the entire recording, I used the editing feature to cut out the mistakes, which were easy to identify in the audio trace due to the long pauses. I was also able to move slides to the appropriate place if I had mistimed the slide transitions. Editing each video took around 30 minutes to an hour. I found it relatively easy to record and edit the videos, and I became much more efficient after the first few recordings.

I would have liked to ask students and other staff to help in the design and production of the videos, but the timing of the Pilot was not conducive to collaboration.

Impact

Mediasite analytics show that 45% of the students in the cohort viewed at least some of the resources, and 17% of the cohort viewed each video more than once. Students watched the three shorter videos (3–4 min) in their entirety, but the longest video (18 min) showed a drop-off in the number of views after approximately 5 minutes (Figure 1), so in future I will limit my videos to 5 minutes maximum.

Graph showing how students watched the video

Only a few students viewed the videos prior to practical classes; almost all views were in the few days leading up to the practical assessment on Blackboard. This shows that students were using the videos as a revision aid rather than as a preparation tool, probably because I uploaded the videos midway through term: by that stage one of the three groups had already completed the four practical classes, and I did not want to disadvantage this group by promoting the videos as a preparation tool. It will be interesting to see whether I can encourage students to use them for preparation next academic year. My expectation was that time spent viewing would correlate directly with practical assessment grades; however, there is no clear linear correlation (Figure 2).

Graph showing use of videos and grades obtained

For some students attending the practical classes and reading the handbook is enough to achieve a good grade. However, students that spent time viewing the videos did get a higher average than those that did not view any (Figure 3), although this probably reflects overall engagement with all the available learning resources.  Responses to the student survey indicated that students felt the videos improved their understanding of the topic and supported them to revise what they had learnt in class at their own pace.

Graph showing video watching and grades obtained

Reflections

The biggest challenge I faced was trying to recruit colleagues to the Pilot during a very busy Autumn term, and finding the time to design the videos myself. It would have been helpful to see some examples of how to use personal capture before I started, but having participated in the Pilot I now have more confidence. Once I had experimented with the Mediasite software, I found it quite easy to record the videos and publish them to my Blackboard site (with guidance from the excellent TEL team and Blackboard help web pages). I liked the editing tools, although I would very much like the ability to cut and paste different videos together. The analytics are very useful and much better than the “track users” function in Blackboard; they reinforced the suggestion that students are much more likely to finish watching short videos, and I would advise making videos 5 minutes maximum, ideally 3 minutes, in length. My experience of personal capture was incredibly positive, and I will certainly be making more resources for students on all my modules.

Follow-up

Since making the recordings for the Pilot, I have teamed up with several colleagues in the School of Biological Sciences and will show them how to use Mediasite so that they can make resources for their modules over summer. I have also used the Mediasite software to record microscope training sessions and talks from open days.

Building bridges and smoothing edges

Patrick Finnegan – School of Economics, Politics & International Relations

Overview

My use of the personal capture scheme was intended to enhance our teaching methods within the department. My initial aim of building additional video capture material into the ongoing lecture series was not realised, but I was able to use the capture package to engage my students more in the administration of a (then) overly complicated module.

Objectives

  • Initial plan centred on including personal capture on the Army Higher Education Pathway project – this was not possible due to software incompatibility with the Canvas platform used for the project
  • New objectives were based on a different module (The Study of Politics) and improving the student experience on that module
  • Improve the explanation of methods
  • Explain the supervisory choice system
  • Enhance lectures on complicated topics

Context

The module I focused on was Po2SOP (The Study of Politics), with 160 students. Personal capture allowed me, as convenor of our largest module, to communicate with all of my students in a more engaging way. We needed a way to bring the topic to life and ensure that the students took on board the lessons we needed them to. I wanted to include real examples of the methods in action, and to use the screencasts to explain certain decisions that would be too difficult to convey via email.

Implementation

Unfortunately, the project began too late in the term to really affect the lectures on this module, which is co-taught between several staff members, often using pre-existing slides. However, I was able to use it to engage in discussion with students to explain issues such as supervisor reallocation during the year and how our special event – the mini-conference – was to work. Rather than writing lengthy emails, I was able to quickly and visually explain to the students what was happening and to invite their responses, which some gave. They did not engage with the capture material as such, but my use of it did encourage discussion about how they would like to see it used in future, and how they would like to receive feedback on assessments if audio/visual options were available. The recordings made by me and my colleague were mainly PowerPoint voice-overs or direct-to-camera discussions. This allowed us to present the students with illustrations and ‘first-hand’ information. The recordings required significant editing to make them suitable, but the final product was satisfactory.

Impact

Beyond ‘ease of life’ effects this year there was not a great deal of impact, but this was expected given the start date (the largest number of views for a single video was 86, and that was an admin-explanation-style video). However, planning for next year has already incorporated the potential advantages of personal capture. For example, the same methods module will now incorporate tutorial videos made within the department and will maintain some supervisor ‘adverts’ to allow students to better choose which member of staff they seek to work with. Within other modules, some staff members will take the opportunity to build in flipped-classroom-style teaching and other time-heavy elements that were not previously available to them.

Reflections

The time needed to organise and direct co-pilots within a teaching-heavy department was far greater than I had originally planned. I was also not expecting the level of resistance I met from some more established staff, who were not interested in changing how they delivered material they had prepared earlier. The major change I would make going forward is to focus on upcoming modules rather than pre-existing ones, as incorporating the material once a module has already started proved too difficult.

Follow-up

I have started to prepare some videos on material I know will be needed in the future; this is relatively straightforward to do and will mimic the general practice to date. The main evolution will be in responding to student need during class, and in how screencasts can be made on demand and with consistent quality.

Creating screencast videos to support and engage post-graduate students

Sue Blackett – Henley Business School, 2018-19

Image of Sue Blackett

Overview

I participated in the university’s Personal Capture pilot as a Champion for my school, trialling the Mediasite tool to create screencast videos for use in teaching and learning. My aim was to help PGT students get to grips with key elements of the module. The videos allowed students to view content repeatedly, with the aim of increasing engagement with the module. Some videos were watched multiple times at different points throughout the term, indicating that information needed to be refreshed.

Objectives

  1. To connect with the cohort and establish module expectations.
  2. To reduce class time taken up with module administration.
  3. To provide coursework feedback in an alternative form and reinforce its feedforward use for the final exam.
  4. To provide exam revision advice and highlight areas of focus.
  5. To support students with weaker English language skills.
  6. To provide module materials in a reusable, accessible and alternative form.

Context

The target audience was students on ACM003 Management Accounting Theory & Practice, a postgraduate course where 91% of students were native Mandarin speakers. English language skills were an issue for some students, so capture video provided opportunities for students to re-watch content and get to grips with it at their leisure. In addition, I wanted to free up class contact time so I could focus on areas that had proved more challenging on the previous run of the module. Also, using different colours and font sizes on the PowerPoint slides gave visual emphasis to key points, reinforcing the accompanying audio.

Implementation

The first video recorded was a welcome-to-the-module video (slides and audio only) covering the module administration: an overview of the module, an outline of assessment, key dates, the module textbook, etc. The content was relatively straightforward, as it was taken from the first lecture’s slides. By isolating the module admin information, more could be added, e.g. mapping assessable learning outcomes to assessments and explaining the purpose of each type of assessment. When first recording the video, I did not follow a script, as I was trying to make my delivery sound more natural. Instead, I made short notes on slides that needed extra information and printed off the presentation as slides with notes. As this is the same strategy I use to deliver lectures, I was less concerned about being “audio ready”, i.e. not making errors in my voice recording.

In the second and third videos (coursework feedback and exam revision advice), I included video of myself delivering the presentations. As the recordings were made in my home office, additional visual matters had to be considered: what I was wearing, the background behind me, looking into the camera, turning pages, etc. The second attempts of each recording were much more fluent and were therefore the versions uploaded to Blackboard.

The last two recordings were quite different in nature. The coursework feedback video used visuals of bar charts and tables to communicate statistics, accompanied by audio focused on qualitative feedback. The exam revision video used lots of narrative bullet points.

Examples of my videos:

Welcome to module: https://uor.mediasite.com/Mediasite/Play/7a7f676595c84507aa31aafe994f2f071d

Assessed coursework feedback: https://uor.mediasite.com/Mediasite/Play/077e974725f44cc8b0debd6361aaaba71d

Exam revision advice: https://uor.mediasite.com/Mediasite/Play/94e4156753c848dbafc3b5e75a9c3d441d

Resit Exam Advice: https://uor.mediasite.com/Mediasite/Play/e8b88b44a7724c5aa4ef8def412c22fd1d

Impact

The welcome video did have impact, as it was the only source of information about the administration of the course. Students arriving at the first class with the textbook indicated that they had been able to access the information they needed to prepare. The response rate to the personal capture pilot project questionnaire was low (18%); however, the general feedback was that the videos were useful in supporting students during the course.

 Analysis of analytics via MediaSite and Blackboard provided some very interesting insights: 

  1. Most students did not watch the videos as soon as they were released. 
  2. Some of the videos were watched multiple times throughout the term by weaker and stronger students. 
  3. Some students were not recorded as having accessed the videos. 
  4. Students were focused for the first 20–60 seconds of each video and then skipped through the videos. 
  5. Few students watched the videos from start to finish, e.g. the average time watched for the 4 min 49 sec welcome video was 2 min 10 sec. The coursework feedback video was 9 min 21 sec, with an average viewing time of 3 min 11 sec; the revision video followed the same trend, being 8 min 41 sec long with an average watching time of 2 min 55 sec.

Review of video along with watching trends showed that students skipped through the videos to the points where slides changed. This suggested that the majority were reading the slides rather than listening to the accompanying commentary which contained supplementary information. 

As no student failed to meet the admin expectations of the course, those who had not watched the video must have been informed by those who had.

Reflections

The analytics were most illuminating. My appearing in the videos was supposed to establish a bond with the cohort and increase engagement; however, my appearance seemed to be irrelevant, as the students focused on reading rather than listening. This could have been due to weaker listening skills, but it also highlights that students may assume all important information is written down rather than spoken.

Videos with graphics were watched more than those without, so my challenge will be to think about what content I include in slides, i.e. more graphics with fewer words and/or narrative slides with no audio.

I will continue with capture videos; however, I will do more to test their effectiveness, for example by designing in-class quizzes using Kahoot, Mentimeter, etc. to test whether the content of the videos has been internalised.

Follow-up

I’ve become much quicker at designing the PowerPoint content and less worried about stumbling or searching for the right words. I have been able to edit videos more quickly, e.g. cutting out excess time and cropping the end of a video. Embedding videos in Blackboard has also become easier the more I’ve done it. The support information was good; however, I faced a multitude of problems that IT Support had to help me with, which, if I’m honest, was putting me off using the tool (I’m a Mac user, mostly using the tool off campus).

 

Making full use of Grademark in geography and environmental science – Professor Andrew Wade

 

Profile picture for Prof. Andrew Wade

Professor Andrew Wade is responsible for research in hydrology, focused on water pollution, and for undergraduate and postgraduate teaching, including Hydrological Processes.

OBJECTIVES

Colleagues within the School of Archaeology, Geography and Environmental Sciences (SAGES) have been aware of the University’s broader ambition to move towards online submission, feedback and grading where possible. Many had already made the change from paper based to online practices and others felt that they would like the opportunity to explore new ways of providing marks and feedback to see if handling the process online led to a better experience for both staff and students.

CONTEXT

In Summer 2017 it was agreed that SAGES would become one of the Early Adopter Schools working with the EMA Programme. This meant that the e-Submission, Feedback and Grading workstream within the Programme worked very closely with both academic and professional colleagues within the School from June 2017 onwards, in order to support all aspects of the change from offline to online marking, and the broader processes, for all coursework except where there was a clear practical reason not to, for example field notebooks.

I had started marking online in 2016–17, so was familiar with some aspects of the marking tools and some of the broader processes.

IMPLEMENTATION

My Part 2 module, GV2HY Hydrological Processes, involves students producing a report containing two sections: Part A comprises a series of short answers based on practical-class experiences, and Part B requires students to write a short essay. I was keen to use all of the functionality of Grademark/Turnitin during the marking process, so I spent time creating my own personalised QuickMark bank, allowing me to simply pull across commonly used feedback phrases and marks against each specific question. This function was particularly useful when marking Part A: I could pull across QuickMarks showing the mark and then, in the same comment, explain why the question received, for example, 2 out of a possible 4 marks. It was especially helpful that my School circulated a discipline-specific set of QuickMarks created by a colleague. We could then pull the whole set, or just particular QuickMarks, into our own personalised sets if we wanted to. This reduced the time spent on personalisation and further improved the quality of my own set.

I also wanted to explore the usefulness of rubric grids as one way to provide feedback on the essay content in Part B of the assignment. A discipline-specific example rubric grid was created by the School and sent around to colleagues as a starting point. We could then amend this rubric to fit our specific assessment or, more generally, our modules and programmes. The personalised rubrics were attached to assignments using a simple process led by administrative colleagues. When marking, I would indicate the level of performance achieved by each student against each criterion by simply highlighting the relevant box in blue. The rubric grid was used alongside both QuickMarks and in-text comments in the essay, and more specific comments were given in the blank free-text box to the right of the screen.

IMPACT

Unfortunately, module evaluation questionnaires were distributed and completed before students received feedback on their assignments, so the student reaction to online feedback using QuickMarks, in-text comments, free-text comments and rubrics was not captured.

In terms of the impact on the marker experience, after spending some initial time getting my personal Quickmarks library right and amending the rubric example to fit with my module, I found marking online easier and quicker than marking on paper.

In addition, I found that the use of rubrics helped to ensure standardisation. I felt comfortable that my students were receiving similar amounts of feedback, and that this feedback was consistent across the cohort and when I returned to marking after a break. When moderating coursework, I tend to find more consistent marking where colleagues have used a rubric.

I also felt that students received more feedback than they usually might, but am conscious of the risk that they drown in the detail. I try to use the free-text boxes to provide a useful overall summary and to avoid overuse of QuickMarks.

I don’t worry now about carrying large amounts of paper around or securing the work when I take assignments home. I also don’t need to worry about whether the work I’m marking has been submitted after the deadline – under the new processes established in SAGES, Support Centre colleagues deduct marks for late submission.

I do tend to provide my cohorts with a short piece of generic feedback, including an indicator of how the group performed: the percentage of students attaining a mark in each class. I could easily access this information from Grademark/Turnitin.

I’m also still able to work through the feedback received by my Personal Tutees. I arrange individual sessions with them, they access ‘My Grades’ on Blackboard during this meeting and we work through the feedback together.

One issue was that, because of the way the settings had been configured, students could access their feedback as soon as we had finished writing it. This was identified quickly and the settings were changed.

REFLECTIONS

My use of online marking has been successful and straightforward, but my experience has been helped very significantly by the availability of two screens in my office. These had already been provided by the School but became absolutely essential. Although I largely mark in my office on campus, when I mark from home I set up two laptops next to each other to replicate having two screens. This set-up allows me to keep the student’s coursework on one screen whilst using the other for reference material.

One further point of note is that the process of actually creating a rubric prompted a degree of reflection on what we actually want to see from students against each criterion and at different levels. This was particularly true around the grade classification boundaries: what is the difference between a high 2:2 and a low 2:1 in terms of each of the criteria we mark against, and how can we describe these differences in the descriptor boxes of a rubric grid so that students can understand them?

This process of trying to make full use of all of the functions within our marking tools has led to some reflection surrounding criteria, what we want to see and how we might describe this to students.

LINKS

For more information on the creation and use of rubrics within Grademark/Turnitin, please see the Technology Enhanced Learning blog pages here:
http://blogs.reading.ac.uk/tel/support-blackboard/blackboard-support-staff-assessment/blackboard-support-staff-turnitin/turnitin-rubrics/

Introducing online assessment in IFP modules – Dr Dawn Clarke

OBJECTIVES

Colleagues within the IFP wanted to improve the student assessment experience. In particular, we wanted to make the end-to-end process quicker and easier, and to reduce printing costs for students. We also wanted to offer consistency with undergraduate programmes; this was particularly important for those students who stay in Reading after their foundation year to undertake an undergraduate degree. We were also keen to discover any additional benefits or challenges we had not anticipated.

CONTEXT

No IFP modules had adopted online submission, grading and feedback until Spring 2015. We were aware of a number of departments successfully running online assessment within the University, and of the broader move towards electronic management of assessment within the sector as a whole. In 2015 we introduced online assessment for all written assignments, including work containing pictures and diagrams, on the IFP modules ‘Politics’ (PO0POL) and ‘Sociology’ (PO0SOC).

IMPLEMENTATION

We made the decision very early in the process to use Turnitin Grademark within the Blackboard Grade Centre. This was consistent with existing use in the Department of Politics.

We created a set of bespoke instructions for students to follow when submitting their work and when viewing their feedback. These instructions were based on those provided by the Technology Enhanced Learning Team, but adjusted to fit our specific audience. They were distributed in hard copy, and we spent some time in class reviewing the process well before the first submission date.

Submission areas in Blackboard and standard feedback rubric sections were created by the Departmental Administrator who was already highly experienced.

IMPACT

Overall, the end-to-end assessment process did become easier for students. They did not have to travel to campus to submit their assignments, and they enjoyed instant access to Turnitin.

Turnitin itself became a very useful learning tool for pre-degree foundation students. It not only provided initial feedback on their work but prompted a dialogue with the marker before work was finally submitted. For students right at the start of their university experience this was extremely useful.

It was equally useful to automate deadlines. Students understood the exact time of the deadline very clearly, and the marker remained external to this process, allowing them to adopt a more neutral position. This was more transparent than manual systems and ensured a visibly consistent experience for all students.

In addition to this, because students did not have to print out their assignments, they became much more likely to include pictures and diagrams to illustrate their work. This often improved the quality of submission.

All students uploaded their essays without any additional help. A small number also wanted to upload the PowerPoint slides from their in-class presentations at the same time, which meant we needed to work through the difficulty of uploading two files to one submission point.

Moving to online assessment presented a number of further challenges. In particular, we became aware that not all students were accessing their feedback. Arranging online access for external examiners in order to moderate the work presented a final challenge. We then worked to address both of these issues.

REFLECTIONS

It would be really helpful to explore the student experience in more depth. One way to do this would be to include a section specifically focused on feedback within IFP module evaluation forms.
In the future we would like to make use of the audio feedback tool within the Grade Centre. This will enhance the experience of international students and their chances of developing language skills.

Using Quickmarks to enhance essay feedback in the department of English Literature – Dr Mary Morrissey

Within the department, I teach primarily in Early Modern and Old English. For more details of my teaching please see Mary Morrissey Teaching and Convening

My primary research subject is Reformation literature, particularly from London. I am particularly interested in Paul’s Cross, the most important public pulpit in sixteenth- and seventeenth-century England. I retain an interest in early modern women writers, with a particular focus on women writers’ use of theological arguments. Further details of my research activities can be found at Mary Morrissey Research

OBJECTIVES

A number of modules within the Department of English Literature began using GradeMark as a new marking tool in the Autumn of 2015. I wanted to explore the use of the new QuickMarks function as a way of enhancing the quality of the feedback provided to our students and ensuring the ‘feedback loop’ from general advice on essay writing to the feedback on particular pieces of assessed work was completed.

CONTEXT

The Department has developed extensive guidance on writing skills to support student assessment, including advice on structuring an argument as well as guidance on grammar and citations. This guide was housed in departmental handbooks and in the assignments folder in Blackboard, and there was considerable concern that it was underused by students. We knew that the QuickMarks function was being used as part of our online feedback provision, and that it was possible to personalise the comments we were using and to add links to those comments as a way of providing additional explanation to students.

IMPLEMENTATION

In order to allow relevant sections of the essay writing style guide to be accessed via QuickMarks, I copied the document into a Google Doc, divided it into sections using Google Doc bookmarks, and assigned each bookmark an individual URL. I then used Bitly.com to shorten each bookmark’s URL to make it more usable, and created a set of QuickMarks that included these links to the Style Guide. In this way, students had direct access to the relevant section of the Guide while reading their feedback. So if a student had not adopted the correct referencing format (the Modern Humanities Research Association style in the case of English Literature), the marker would drag a QuickMark across to the relevant point of the essay. When the student hovered over this comment bubble, they would see the text within it, but were also able to click on the URL, taking them directly to page 7 of the departmental writing style guide on MHRA citation and referencing. When other colleagues wanted to adopt the same approach, I simply exported the QuickMark set to them, and they incorporated it into their own QuickMarks bank within seconds.
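The workflow above amounts to maintaining a mapping from QuickMark labels to shortened style-guide links. A minimal sketch of that mapping is shown below; the labels and bit.ly URLs are hypothetical placeholders, not the department’s real links.

```python
# Illustrative sketch of mapping QuickMark labels to shortened
# style-guide section links. All labels and URLs are invented
# placeholders for the purpose of the example.

STYLE_GUIDE_LINKS = {
    "MHRA referencing": "https://bit.ly/example-mhra",
    "Paragraph structure": "https://bit.ly/example-para",
    "Quotation format": "https://bit.ly/example-quote",
}

def quickmark_comment(issue: str) -> str:
    """Build the text of a QuickMark bubble, appending the relevant
    style-guide link when one exists for the flagged issue."""
    link = STYLE_GUIDE_LINKS.get(issue)
    if link is None:
        return f"{issue}: see the departmental style guide."
    return f"{issue}: see {link} for the relevant section of the style guide."

print(quickmark_comment("MHRA referencing"))
```

Keeping the links in one table like this also makes it easy to hand the whole set to a colleague, much as the exported QuickMark set was shared.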

IMPACT

The Bitly.com tool, used to shorten the URL link, monitored the usage of each link included in our QuickMarks. This showed us how many times and on which date each individual link was used.

To complement this data, I also ran a survey on the student response to online marking and feedback, to which 35 undergraduate students responded. This showed that students found feedback most useful when it came in forms that were familiar from paper marking, like general comments on the essay and marginal comments throughout it. Less familiar types of feedback (such as links to web resources included in bubble comments accessed by hovering the cursor) were often missed. In the survey, 28 out of 35 students said that they did not receive any links to the writing style guide within their QuickMark comments, even though more than this did receive them. Three students did not click on the links. Of the five remaining students who did make use of the links, three responded positively, mentioning their value in terms of improving their writing skills:

“It was good to refer to alongside my work”
“They helped me to strengthen my writing overall”
“Yes motivational to actually look at them – whereas on a paper copy you might read the comment and forget, but here you can click straight through, so much easier!”

REFLECTIONS

Some of the new functions available to us in GradeMark allow us to improve our feedback. We shouldn’t just be using online marking tools to replicate existing offline marking processes; we can go much further! But if this is going to be successful, it is really important to inform students about the range of options that online marking makes available, so that they make the most of the systems we use.

Once we do this effectively, we can then explore other options. In English Literature, we are keen to ensure that our Department style guide is used effectively. But there are many other web resources to which we could link through QuickMarks: screencast essay writing guides in Politics and IWLP, as well as the new Academic Integrity toolkit by Study Advice, for example.

By including links within QuickMark comments we help to move students towards greater levels of assessment literacy.

LINKS

Academic Integrity Toolkit

http://libguides.reading.ac.uk/academicintegrity

Examples of assessment support screencasts created by colleagues

Screencast bank

Study Support Screencast Suite

https://www.reading.ac.uk/library/study-advice/guides/lib-sa-videos.aspx

Bitly URL shortener and link management platform

https://bitly.com/

MOVING TOWARDS E-ASSESSMENT: The Use of Electronic Submission and Grading on the Academic Skills Module – Svetlana Mazhurnaya

Profile picture for Svetlana Mazhurnaya

I have been teaching English for over 15 years. I worked on EFL courses in Russia and the UK between 2000 and 2012. I started teaching English for Academic Purposes in 2013 when I joined the International Foundation Programme at the University of Surrey. I have been working as an EAP tutor at the University of Reading since 2014, first on the International Foundation Programme and now on the Pre-sessional English Programme. I have recently become part of the assessment group within ISLI, which creates and administers tests of EAP.

OBJECTIVES

• To familiarise Foundation level students with e-assessment practices as part of their preparation for Undergraduate Courses at UoR
• To simplify assessment administration procedure for multiple module subgroups with varied deadlines on a 20-credit module
• To reduce the marking workload associated with paper submissions
• To deliver more timely and accessible feedback to students

CONTEXT

The International Foundation Programme has a 15-module portfolio delivered by various UoR departments. International students joining the course have to manage multiple assessment deadlines and follow the academic assessment practices used within the departments delivering their core modules. To support them, the IFP runs a 20-credit Academic Skills module, taught over 2 hours per week and assessed through a combination of formative and summative oral and written assignments marked offline. A combination of Word documents, Excel spreadsheets and online RISIS reports was used for assessment data administration. When I joined the programme in 2015, the team were looking for ways of:

• optimising the administration of a large volume of paper submissions with multiple sub-group deadlines

• reducing the tutors’ marking workload & simplifying the assessment data entry process

• gauging the level of learners’ engagement with feedback

IMPLEMENTATION

The trial

• Having previously used electronic marking tools, I was keen to introduce them on the IFP. With the Module Convenor’s support, I started trialling the Turnitin e-submission and grading tools with my sub-group in spring 2015. It was agreed that a formative assessment piece would be suitable for the trial, to allow space for error, and that learner training could be integrated into the module syllabus as part of developing the students’ referencing and source integration skills. There were 3 classroom demonstrations: how to submit work, how to check originality reports and how to access electronic feedback. Learners were also signposted to the learner training resources available on Blackboard. Some students requested further guidance and were supported through peer-led demonstrations in subsequent lessons. The fact that most students managed the e-submission with minimal training was an encouraging start.

• To maintain consistency in feedback delivery with the other module subgroups, I created a QuickMarks set based on the existing module error correction codes that all of the tutors used, and hyperlinked them to the online practice materials we normally recommend to students when suggesting areas for improvement. I also uploaded our mark scheme as a Turnitin rubric and, as usual, provided global feedback comments on submitted work. The only difference in the feedback delivery was its online mode, and the fact that each QuickMark was associated with one of the 5 assessment criteria, such as “organization” or “task completion”, hopefully making the rationale behind the grading more explicit.

• Students reacted favourably to receiving electronic feedback, saying that they liked having instant access to their grades through the “My Grades” feature, and that word-processed comments were easier for international students to understand than handwritten ones. They also liked the fact that the QuickMarks we used were hyperlinked to external practice materials, which allowed them to work independently. For example, a comment on referencing issues was linked to the referencing guidelines page.

• Interestingly, the electronic assignment inbox showed that the students’ level of engagement with feedback varied: some viewed the marks but did not access the detailed feedback; others read the comments but did not explore the hyperlinks. This has prompted us to run follow-up tutorials that students have to prepare for using tutor’s feedback. Overall, the trial was largely successful but highlighted the need for some more learner training in how to process e-feedback.

Sharing practice

• Because the online marking procedure used with the trial group was largely replicating our existing off-line marking procedure in a less time-consuming way, other module tutors were keen to experiment with e-assessments. The Programme Director and the Module Convenor were very supportive and allowed me to spend time on one-to-one consultations with team members in order to demonstrate the benefits of using e-assessment tools and train them if they wished to trial them.

Wider implementation

• Over the next couple of terms it was decided to introduce e-submission for all written coursework assignments in order to optimise the administration process. However, tutors were allowed the flexibility of marking online, downloading e-submissions to mark them in Word, or printing papers. This approach met our staff training needs and working styles.

• The challenge at this stage was that the e-feedback and grades had to be transferred into the official feedback forms and spreadsheets for consistency purposes. In order to avoid multiple data entry, we decided to start using the Turnitin rubric and the Blackboard Grade Centre. Creating a Turnitin rubric was easy and eliminated the need for calculating grades in Excel documents and transferring them to a master spreadsheet. We have not moved away from Excel documents completely, but have significantly reduced the manual data entry load.

• By autumn 2016, all Academic Skills written assignments were submitted and graded online.
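The grade calculation that the Turnitin rubric automated is essentially a weighted sum of criterion scores. As a hypothetical illustration (the criteria, weights and scores below are invented, not the module’s actual mark scheme):

```python
# Minimal sketch of the grade arithmetic a rubric tool automates,
# replacing manual calculation in Excel. Criteria and weights are
# hypothetical examples only.

CRITERIA_WEIGHTS = {
    "task completion": 0.25,
    "organisation": 0.25,
    "language accuracy": 0.25,
    "source use": 0.25,
}

def final_mark(scores: dict) -> float:
    """Combine per-criterion scores (0-100) into an overall mark
    using the weights above, as a rubric tool would do automatically."""
    return round(sum(scores[c] * w for c, w in CRITERIA_WEIGHTS.items()), 1)

print(final_mark({
    "task completion": 70,
    "organisation": 60,
    "language accuracy": 55,
    "source use": 65,
}))  # 62.5
```

Once the weights live in the rubric itself rather than in a spreadsheet, the same calculation runs identically for every tutor and every script, which is what removed the double data entry described above.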

IMPACT

Effect on the students

• Students find the new submission procedure, with a single submission point and an electronic receipt system, easier to follow.

• Many IFP students have used the opportunity to submit work remotely while visiting their families abroad during holidays.

• Many students are using Turnitin Originality reports as a formative learning tool that helps them see how well they have paraphrased or referenced source material, and to revise their drafts more independently, which has resulted in fewer cases of unintentional plagiarism.

• Marking is more transparent to learners, because they can see the number and type of QuickMark comments associated with each criterion their work has been graded on.

• Generally, they now view e-submission and feedback as part of the daily university activities, which prepares them for the reality of the academic studies on their future degree courses.
Effect on the Tutors

• Using e-submission has decreased the burden of assessment administration: instead of sorting large volumes of student papers into sub-groups manually tutors use GradeCentre SmartViews to filter out their students’ submissions.

• Non-submitters are identified and sent a reminder earlier. In the past, non-submitters could only be identified after the anonymous marking process had been completed, which often resulted in a hefty penalty. Now a tutor or the module convenor uses the “e-mail non-submitters” button right after the deadline to chase the students (even if marking is anonymous). As a result, students who failed to submit their assignment, or uploaded it to the wrong submission point, receive an early reminder. For many IFP students this is a learning curve, and getting an early reminder helps them.

• Marking has become easier with Turnitin: tutors can manage the 15-day turnaround time better, because they can start marking straight after the deadline rather than waiting for printed copies to be distributed. Many find QuickMarks hyperlinked to external practice or reference materials helpful as a way to feed forward without giving a lengthy explanation. Some tutors reported being slowed down by internet connection issues, and it also took us some time to adjust to the new Feedback Studio interface.

• Using electronic assessment tools has also prompted a professional dialogue about our current assessment practices and highlighted the need for protocols on e-submission, e-moderation and external examining. So it is great news that such guidelines are being developed as part of the EMA work.

• We have gained a better overview of IFP students’ engagement levels, because GradeMark allows us to identify and contact non-submitters with one click. It also shows us the number of submission attempts and whether students have accessed feedback prior to tutorials, which helps us to support at-risk students better.

Effects on the Module Convenor

• The module convenor has gained a better real-time overview of the marking process: number of scripts marked so far, marking analytics (average, standard deviation, range), all displayed in the GradeCentre column statistics. This has allowed the module convenor to support the tutors by re-distributing scripts or helping to mark and moderate.

• The module convenor can also see how much feedback is given to students across the board, which is important for quality assurance purposes.

• Dealing with possible cases of academic misconduct and late submissions has become easier thanks to Turnitin originality reports and electronic receipt system.

REFLECTIONS

• Our team’s experience has shown that it is worthwhile trying to integrate electronic assessment literacy into the course syllabus. It would also be great if there were university-wide learner-training sessions, similar to CQSD sessions offered to staff.

• Moving our module towards e-assessment was manageable because our approach to electronic tools has been selective: where our current assessment practices worked well, we only sought to replicate them; when a change was needed, we looked for ways technology could be used to implement it.

• Sharing best practice and providing peer support has proven to be a good way of encouraging more colleagues to use e-assessment tools, because the change was not perceived as top-down driven.

• Having the programme management support has really helped our small community of e-practitioners to grow. Creating training opportunities and allowing some flexibility during the transition to e-practice have been key to its success. There was a point when our exploratory e-assessment practices needed to be more standardized and programme level decisions were key to maintaining consistency of practice.

FOLLOW UP

Following the successful trial of the e-assessment tools on the Academic Skills and International English Modules, the programme management is keen to encourage other IFP modules to trial them.

In Spring 2017, a member of the Blackboard Team delivered a Staff Development Session on GradeMark to the IFP team.

We are currently exploring the possibility of doing our internal and external moderation electronically.

ELECTRONIC FEEDBACK AND GRADING METHODS – Dr Geoff Taggart

Profile picture for Dr Taggart

Dr Geoff Taggart is a lecturer in the Institute of Education and Programme Director for the Early Years Practice programme at Reading. As part of his secondment to the EMA programme, Geoff decided to run a focus group with students from the IoE to gather perspectives on electronic feedback and grading methods.

OBJECTIVES

To identify student views on:

• The perceived benefits of the three most commonly-used forms of feedback offered by GradeMark (i.e. QuickMarks, rubrics and text comments)

• Preferences regarding the emphasis which each form of feedback should be given in a typical piece of work

• Views regarding the interrelationship of the different forms of feedback

CONTEXT

The focus group was composed of 4 MA students (2 international and 2 home), plus one Chinese academic visitor with recent experience of being a student. Their views were therefore representative of students engaged in social science disciplines and may not be transferable to other fields. Also in attendance were myself, Dr Maria Kambouri (engagement in feedback project) and Jack Lambert-Taylor (EMA). It took place at London Road campus between 5 and 6.30pm on Thurs 18th January.

IMPLEMENTATION

I provided participants with three copies of the same assignment, one marked exclusively with Quickmarks, one marked only with the final text comment and one marked solely according to the rubric. The purpose of this was to isolate and focus attention upon each of the three kinds of electronic feedback provided through the Feedback Studio.

The marking was not meant to be typical (nor to serve as an example of best practice) but to highlight the positive and negative qualities of each kind of feedback. For example, there were many more QuickMark comments appended to the assignment than would usually occur. The purpose of this was to emphasise both the positive benefits of maximised contextualised feedback and the negative impression of ‘overload’ which the comments could give. Additionally, the text comments amounted to over 2,500 words and were extremely conversational and wide-ranging.

In a similar way, whilst this strategy deliberately emphasised the dialogical and personal nature of this feedback method, it was not easy to pick out straightforwardly the points where the student needed to improve. By contrast, the rubric does this very clearly, but is not a personal way of providing feedback.

REFLECTIONS

Quickmark feedback

• Students appreciated Quickmarks which contained hyperlinks (e.g. to Study Advice)

• One participant noted that they didn’t like the QuickMarks, on the basis that the printed document does not have interactive links. The same participant suggested that excessive QuickMarks may be intrusive and give the impression of ‘massacring’ a student’s work; they agreed that less excessive use would be preferable. They also noted that there was no ‘positive’ or ‘constructive’ feedback on the page, only problem points. This may be due to the nature of the sample work, which was deliberately of a poor standard; perhaps the same study should be conducted with a high-quality piece of work.

• Another participant noted that narrative summaries can come across as more personal, particularly if negative, and that they preferred Quickmarks on the basis that they provided a more objective tone. Another participant suggested that Quickmarks may come across as more ‘humane’ on that basis, rather than a ‘rant at the end’.

• Another participant suggested that Quickmarks provide good evidence of the thoroughness of the marking process.

• One participant suggested that a QuickMark could indicate which assessment criterion in the rubric it refers to. The facility to do this was explained.

• It was noted that QuickMarks should be written passively rather than directed at the author, as the latter can appear more accusatory: for example, ‘The point is not clear here’ as opposed to ‘You have not been clear here’.

Summary – QuickMarks should be limited in their use, include positive as well as negative comments, include relevant hyperlinks, be focussed on the assignment rather than the student, and be associated with rubric criteria where possible.

Text comments

• Two participants suggested that narrative summary can provide more detailed feedback and valued the conversational tone. It was also suggested that Quickmarks may be perceived as momentary thoughts without reflection, whilst narrative summary may come later after further thought.

• One participant noted that when you write an essay you aren’t ‘just trying to tick boxes in a rubric, you are trying to say something’. This was a really interesting point which emphasised the student expectation of a personal, dialogical relationship with their tutor (something which rich text comments support).

• Several participants noted that marking with more narrative summary would be more time-consuming, and expressed empathy for academics doing so.

• It was also noted that narrative summary would be better-fitted to a conversation in person, and that subtleties within the feedback would be better expressed through intonation in the voice and facial expressions of the marker. Absent those features, it can come across as very serious, and lacks intricacy.

• Students commented that this kind of feedback can also become too ‘waffly’ and lack focus.

Summary – This kind of feedback gives the strongest impression that the tutor has considered the assignment overall, mulled it over and arrived at a holistic impression, something that was highly valued (contrast with: ‘a marked rubric alone shows that the tutor perhaps didn’t think about it that much’). However, the writing needs to be clearly focussed on specific ways in which the student can improve (i.e. bullet points).

Rubric

• Students commented positively that the rubric showed very clearly how successful an assignment had been in general terms. However, they were concerned that it does not explain how to improve if you have not done very well.

• Students questioned how the final mark is actually calculated through the use of a qualitative rubric where the different elements are unweighted – this was considered to lack full transparency.

• It was unanimously agreed that a rubric without comments was not a preferable form of feedback on its own due to lacking feed-forward information, despite the fact that the adjacent rubric statements (i.e. in the next grade band up) also appear to students in the feedback.

• Students did not like the way in which the rubric statements were presented as a consecutive list when printed off. They much preferred the grid they were used to (i.e. with grade boundaries as the columns and rubric criteria as the rows).

Summary – a rubric is useful in showing how successful an assignment has been in a broad and general sense. It could be made more useful by tailoring the rubric to the particular assignment (which would mean maintaining multiple rubrics across programmes/the School).
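The transparency concern raised above about unweighted qualitative rubrics can be made concrete: the same criterion-level judgements can yield different final marks depending on how they are combined, and the student cannot see which rule was applied. The scores and weights below are invented purely for illustration.

```python
# Illustration of the rubric transparency concern: identical criterion
# scores, two plausible aggregation rules, two different final marks.
# All numbers are invented for the example.

scores = {"argument": 68, "structure": 55, "referencing": 40}

# One plausible reading: a simple (unweighted) average of the criteria.
unweighted = sum(scores.values()) / len(scores)

# Another plausible reading: the marker privately weights argument heavily.
weights = {"argument": 0.6, "structure": 0.3, "referencing": 0.1}
weighted = sum(scores[c] * w for c, w in weights.items())

print(round(unweighted, 1))  # 54.3
print(round(weighted, 1))    # 61.3
```

Publishing the weights alongside the rubric (or using an explicitly weighted rubric) removes this ambiguity, which is what the students' question about how the final mark is "actually calculated" was driving at.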

CONCLUSIONS

1. All forms of feedback, taken together, were considered to be useful.

2. The three different forms of feedback need to support each other (e.g. the rubric needs to reflect the written comments, and tutors could use the same language in their text comments as that used in the rubric statements).

3. No matter the means by which feedback is given, students want to feel as though their work has made an impression on their tutor.

4. If tutors want to mark mostly through Quickmarks and rubrics (and provide greatly reduced written comments), this may be perceived negatively by students who expect a more personalised response.

FOLLOW UP

The following points may require consultation with Blackboard:

• One participant suggested that different colours could be used to indicate whether QuickMark feedback is positive or negative.

• A tutor suggested that it would be helpful if tutors had flexibility about where to position the QuickMarks in their set, as otherwise they appear rather randomly. This is an issue when marking at speed.

• All participants said that they liked the use of ticks in marking, but no alternative was suggested. Can a tick symbol be included in the QuickMark set?

• Tutors are able to expand the rubric when marking. Can it be presented to students in this format?

LINKS

Quickmarks:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/QuickMark

Rubrics:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/Rubrics_and_Grading_Forms

Text comments:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Feedback_Studio/Commenting_Tools/Text_summary_comments