Using GradeMark to write high-quality feedback more rapidly in the School of Law – Dr Annika Newnham


Dr Newnham is the module convenor for LLB Family Law. Her areas of interest include child law, autopoietic theory and the common intention constructive trust.

Since 2015, Annika has gradually personalised the ‘QuickMarks’ function within Turnitin GradeMark to be both discipline-specific and assignment-specific. Dr Newnham has also developed a lengthy comments bank which she can draw on and personalise, enabling her to write high-quality feedback more quickly and speeding up the entire marking process.

OBJECTIVES

The School of Law currently operates online submission, marking and feedback for the vast majority of assessed work. As part of this process it makes extensive use of Turnitin GradeMark and some of the functionality on offer, including QuickMarks. Given the large number of students submitting work within the School and the need to provide high-quality feedback quickly, I wanted to use these new tools to speed up the entire marking process and support the quality and quantity of feedback offered.

CONTEXT

The School of Law recruits strongly, makes extensive use of summative assessment and maintains a large number of core modules. Online assessment has been adopted, in part, to help support the continued provision of high quality feedback in this context while ensuring that feedback is returned to students within 15 working days.

IMPLEMENTATION

GradeMark allows for the customisation of QuickMarks by individual markers. I very quickly began to customise the QuickMarks available to me by adding comments that I make frequently. Over time, my QuickMarks section has expanded to include a whole series of comments, ranging from just a few words to lengthier sections of text. Dragging these across to relevant sections of text saves me a considerable amount of time because I’m not writing out the same type of comment again and again. I’ve even developed my sets of QuickMarks to be specific not only to each module I teach but to each assignment I mark within that module. I carefully save each set with a different name so I can easily access it again. GradeMark even remembers my QuickMarks sets from one year to the next, so my collection appears automatically when I open each new essay.

I wanted to explore the possibilities of reducing marking time whilst maintaining the quality and quantity of feedback in other areas.
This approach worked very well for targeted in-text comments throughout the essay but, like most markers, I also leave summative text in the general comments section in the GradeMark sidebar so that students have a sense of my overall thoughts. I therefore started to compile a lengthy list of comments that I use extensively in a simple, separate Word document, ordered under key headings. Some of these are generic for all essays: writing style, referencing, structure and so on. There are also sets of comments on how students have tackled a particular issue in law, for example how well they have presented balanced arguments on commercial surrogacy, or have understood the different stages of a cohabitant’s claim for a share in her ex-partner’s house. Each heading contains 8–10 different sentences or longer sections covering a wide range of areas I may want to comment on. I am then able to cut and paste the most relevant into the GradeMark comment box and, if needed, rewrite it to suit the specific essay I’m working on. This process has become even more efficient since the arrival of a second screen: I can list my commonly used statements on the left-hand screen and cut and paste, or drag them over to, the actual essay on the right-hand screen. Although I might then want to personalise a statement, I still save a significant amount of time compared with typing everything out repeatedly for each essay.
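For colleagues who prefer plain text to a Word document, the same comments-bank idea can be kept as a simple structured file or script. The sketch below is purely illustrative – the headings and example sentences are invented for demonstration, and nothing here depends on GradeMark itself.

```python
# Purely illustrative: a comments bank organised under key headings,
# mirroring the Word-document approach described above. Headings map
# to reusable feedback sentences that can be copied and personalised.
COMMENTS_BANK = {
    "writing style": [
        "Your writing is clear, but watch out for over-long sentences.",
        "Aim for a more formal register throughout the essay.",
    ],
    "referencing": [
        "Remember to give pinpoint references to specific paragraphs.",
        "Your footnotes need more consistent formatting.",
    ],
    "structure": [
        "A brief roadmap in the introduction would help the reader.",
        "Consider using subheadings to signpost each stage of the argument.",
    ],
}

def comments_for(heading: str) -> list[str]:
    """Return the reusable comments filed under a heading (empty if none)."""
    return COMMENTS_BANK.get(heading.lower(), [])

if __name__ == "__main__":
    # Look up the 'structure' comments ready to paste into a feedback box.
    for comment in comments_for("Structure"):
        print("-", comment)
```

A bank like this could grow per module or per assignment in exactly the way described above; the point is simply that a consistent heading-to-comments structure makes reuse quick.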

IMPACT

I maintain a balance between the use of Quickmarks, my comments bank and specific comments written for each piece of work. Students should not receive exactly the same comments time and time again. Feedback should not become a highly mechanised process. But Quickmarks and comments banks can be used as a starting point or work alongside very specific comments written for a particular piece of work. In this way I can maintain the quality and quantity of my feedback whilst speeding up the marking process considerably. In particular, this approach seems to ensure greater consistency between essays in terms of the amount of feedback that each student receives because it is so much quicker and easier to insert comments. More broadly it feels like a far more efficient process and is certainly a more fulfilling task to undertake.

REFLECTIONS

QuickMarks and cut-and-paste comments have made marking feel much less like a chore, and have removed the irritation often felt when you have to correct the same misunderstanding again and again for different students.

LINKS

Turnitin QuickMark

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_for_Instructors/25_GradeMark/QuickMark

Feedback via audiofiles in the Department of English Literature – Professor Cindy Becker


Cindy Becker is the Director of Teaching and Learning for the School of Literature and Languages and also teaches in the Department of English Literature. She is a Senior Fellow of the Higher Education Academy and has been awarded a University of Reading Teaching Fellowship. She is an enthusiastic member of several University Communities of Practice: Placement Tutors, University Teaching Fellows, Technology Enhanced Learning, and Student Engagement Champions.

Cindy is a member of Senate and has sat on university steering committees and working parties; she is also a member of the Management Committee for the School of Literature and Languages and chairs the School Board for Teaching and Learning. She is the convenor of Packaging Literature and Shakespeare on Film.

In September 2015 she started to trial the use of the audio feedback function within Turnitin’s online marking tool (GradeMark). This innovative approach did present some initial challenges but, overall, it proved to be a great success for both Cindy and her students.

OBJECTIVES

GradeMark was introduced to the University in the Summer of 2015. I wanted to use this new marking tool to explore different ways of providing feedback for students. In particular, I wanted to adopt a more personal approach and provide more in-depth feedback without significantly increasing the time I spend marking each essay.

CONTEXT

GradeMark allows you to produce typewritten feedback for assessed work, and this is what most of us are used to. However, it also lets you click on an icon to record an audio file of up to three minutes of spoken feedback instead.

IMPLEMENTATION

I started off by making notes as I marked the essay and then talking through them on the audio file. It did not work very well because my feedback became stilted, took longer than three minutes and was time consuming to prepare. I think I lacked confidence at the outset.

Now I take a more relaxed approach. I make no more than a couple of notes (and often not even that) and then I simply press the record button. As I talk to the student I scroll down the assignment on the split screen, and this is enough to jog my memory as to what I want to say. Taking a methodical approach has helped me. I always begin with an overview, then work through specific challenges or praiseworthy elements, then end with a brief comment summing up my thoughts. If it goes wrong, I simply scrap the recording and begin again. I save myself time with the uploading by setting it to upload and then beginning work on the next assignment. This saves the frustration of staring at an upload symbol for ages when you want to get on with it.

IMPACT

It is worth the effort.

For now, students love it. I asked students to let me know whether they would prefer written or audio file feedback and those who responded voted for audio file. The novelty factor might wear off, but I think at the moment it is a useful way to engage students in our assessment criteria and module learning aims, in class and beyond.

For now, I love it. It is a pleasant change; it is quicker and fuller than written feedback. It seems to allow me to range more widely and be more personally responsive to students through their assignments. Because I am ‘talking to them’ I have found myself more ready to suggest other modules they might like, or some further reading that they might enjoy.

REFLECTIONS

It can take a few attempts to ensure that your headphones are working within the system. This is usually a problem with GradeMark or Blackboard more generally – restarting Blackboard, or even your computer, will fix it. You might not have headphones already to hand, and that sounds like another investment of time and money, but it’s a good idea to buy cheap headphones – they cost around £20 from a supermarket and are perfectly adequate for the job. You feel like a twit talking to your computer. Of course you do – who wouldn’t? After your first few audio files it will feel perfectly natural.

For the future, I can see it having an impact on assignment tutorials. I believe I can have an equal impact via a tutorial or a three-minute audio file, and everyone actually listens to their audio file. I am going to have to decide what to do with the extra ‘spare’ contact time this might give me…

Changing the assessment experience of professional staff in SAPD – Emily Parsons


Emily Parsons is a Senior Programme Administrator in the School of Agriculture, Policy and Development (SAPD). Online assessment has been adopted throughout SAPD, impacting academic and non-academic colleagues. In this case study, Emily outlines the experiences of her Support Centre team working with SAPD as an Early Adopter School.

OBJECTIVES

To reduce the administrative burden of assessment and improve the overall assessment experience for staff within the Support Centre whilst supporting change within the School.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. SAPD became an Early Adopter School in May 2017, which allowed the EMA Programme to support a significant shift away from a mixture of online and offline marking to the full provision of online marking where practical. The SAPD Support Centre was involved right from the start, working collaboratively with the EMA Programme, TEL, CQSD and the senior School leadership team during the change process. The Support Centre was one of the first to experience the impact of this shift towards greater online marking on its working practices throughout 2017–2018.

IMPLEMENTATION

As an Early Adopter School, SAPD undertook a full change programme to support online submission, feedback and grading as well as support for all underlying processes. A series of meetings and three major workshops lasting between three and four hours were held throughout the Summer involving all collaborating teams.

Initially only two members of the Support Centre team were involved but representation quickly expanded to include at least four members. It was really important to make sure that a range of professional staff views were being heard during the change planning stage particularly because all of these colleagues would play a role in implementing new processes and delivering change.

Each of these collaborative workshop meetings drew everyone together, in person, in one room instead of relying on e-mail correspondence. This proved far more effective. Relying on e-mail could have significantly delayed the process and may not have led to the kind of in-depth, rich discussion around assessment practice, process and policy within the School that was seen at each meeting.

One of the triggers for these debates was the creation of a series of highly detailed process flow diagrams showing the end-to-end assessment process within the University. These process maps outlined who does what, and when, in four main diagrams – anonymous marking using the Blackboard marking tool, named marking using the Blackboard tool, anonymous marking using Turnitin, and named marking using Turnitin. These maps were essential to understanding the end-to-end process and for allowing the School to start thinking about consistent practices.

Following this approach to consistent practice professional staff also created a manual containing essential information such as how to set up submission points or Turnitin similarity reports in the way that the School wanted. All professional staff could then follow this detailed guidance. This proved essential to ensure that all colleagues were working in a similar way.

IMPACT

Two key areas of impact have been experienced within the Support Centre – the first surrounds the adoption of more consistent processes to deal with the submission, receipt, marking and moderation of coursework, and the second surrounds the significant increase in the amount of work marked online.

The adoption of more consistent processes was made possible by the creation of the detailed process diagrams outlined above. These show the 45–50 steps involved from submission to final exam board agreement and confirmation, including exactly who does what, and when. The creation of these process diagrams during the Summer workshops, informed by all of the groups involved, was, in itself, a useful exercise. We could take a step back and really think about how we could make this process as efficient and as effective as possible whilst keeping an element of flexibility to cover any type of submission or new requirement that we collectively hadn’t thought of!

During the workshops, the Support Centre, in collaboration with the School, was also asked to create a large assessment spreadsheet listing all assessments due to be submitted during the academic year. The creation of this detailed assessment spreadsheet, in itself, provided an opportunity for colleagues to pause and review the amount of assessment and the School’s use of different assessment types.

This was also a crucial starting point from which we could categorise assessment types (such as group work, individual essay, video submission) and then think through which of the two marking tools – Blackboard or Turnitin – would be most appropriate for each type. The process diagrams, together with these spreadsheets, helped to support workflow and planning within the Support Centres, which then knew exactly what they had to do, and when, for the full academic year.

Under the new, more consistent processes, academic colleagues were no longer required to create submission points. This role was transferred to professional staff and represented one of the most significant changes undertaken. All submission points are now created in the same way – for example, there is no longer any variation within the School surrounding student views of Turnitin reports, as all students see similarity reports only after the submission deadline. In general, academic colleagues were happy to transfer the set-up of submission points to professional staff and simply had to inform the Support Centre, in advance, when assessment was due. Around 400 pieces of assessment were due during 2017–2018.

Alongside increased consistency surrounding processes, the School has seen significant increases in the amount of work submitted and marked online. Overall this change has improved the assessment experience for colleagues within the Support Centre in a number of ways:

• Previously, using a rota system, colleagues were allocated a time slot to sit in the front office to receive hard copies and process each paper coming in. This was an intense role and so reduced the time available to undertake any other supporting role. There is no need to do this in the current system as submission is managed online for almost all work. This represents a significant time saving for colleagues.

• At the end of the marking process, each paper would also have to be sorted alphabetically and placed in individual envelopes, ready for collection by students. This doesn’t happen now for the vast majority of pieces which are accessed online. In the past this role might have taken half a day. Now it takes an estimated 30 minutes for the small amount of assessment still marked in hard copy. The time saved has been described by professional staff within the team as “extraordinary”.

• This also means that the assessment process has become much more scalable. Support Centres can cope with increases in student numbers without seeing a significant increase in workload.

• The Support Centre used to ask academic colleagues to return marked work to them within 14 working days of submission to allow time for processing. There is no need to do this anymore because marks and feedback are returned online, so academic colleagues now have the full 15 working days to mark submitted work.

• The Support Centre is no longer drowning in a sea of paper, freeing up much more room and saving storage space. This was a particular problem when students failed to come back to collect their work.

• Some of the functions of the marking tools are saving a significant amount of time for the Support Centre. One example surrounds non-submission. When work was submitted in hard copy, it took a considerable amount of time to contact students who had failed to submit. Now Turnitin allows professional staff to send one e-mail to all non-submitters quickly and easily.

• Previously, in order to undertake internal moderation, Support Centre staff would release marks but hold back the hard-copy coursework, which included markers’ feedback, from students until internal moderation had taken place. After this point, the full feedback would also be released. In order to undertake external moderation, Part 2 and Part 3 students were asked to create a portfolio of their work, including marks and feedback, and submit this at the end of the academic year so that external examiners could review the work. Student engagement in this process was variable, with some students having lost their work by this point. In addition, these processes generated a huge amount of paper and took a large number of working hours to manage. This is no longer necessary, aside from a very small amount of fieldtrip work. Internal and external moderators can access both marks and feedback quickly and easily online, from wherever they are in the country.

REFLECTIONS

Moving the School towards more consistent approaches to managing assessment and increasing online marking and feedback has largely been a very positive experience for the Support Centre. We are now enjoying a range of benefits which have made our role within the assessment cycle much more manageable.

We had worried that some areas of work might increase – for example, we might have seen more reported cases of academic misconduct as a result of much greater use of Turnitin similarity reports. This has not occurred, although the School had been undertaking a significant amount of work in this area, including the introduction of a formative piece of work at Part 1 and at the start of the MSc programmes which is then analysed during follow-on seminars.

As we move forward into the next academic year, there are still some areas that we need to think about a little more. We’ve discovered through this process, for example, that there are multiple different ways in which academic colleagues assess and give feedback on presentations. We need to work on understanding the processes in this area more in 2018–2019.

This year we will also be able to start the process of collecting new assessment data and deadlines much earlier. This will enable us to create submission points around July and August. This will place us in a better position to plan ahead for 2018-2019.

Reflecting on change and the management of non-standard submissions in Typography – Dr Jeanne-Louise Moys


Jeanne-Louise teaches design practice, theory and research skills across a range of genres and platforms. She is the Programme Director for the MA Creative Enterprise and the Pathway Lead for the MA Communication Design (Information Design Pathway).

OBJECTIVES

Typography has been keen to continue to support the move from offline to online submission, feedback and grading, where possible. In particular, the Department has wanted to ensure a more consistent and streamlined approach to managing assessment, especially given the range of diverse submission types within Typography programmes. The Department was also very keen to ensure that online marking tools allowed colleagues to provide feedback that supports students’ design literacy. In this respect, markers aim to give feedback designed to allow for openness in the ways students think and to build students’ confidence to develop their own design judgement.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. In 2015–16, the Department of Typography adopted a policy of either online submission or dual submission (where students are asked to submit both an online digital ‘copy’ and a material form as relevant to the particular deliverables of different design briefs) across the undergraduate degree. Paper-based feedback forms were replaced with online rubrics. The Department mainly made use of Blackboard as a marking tool, with some further use of Turnitin, particularly for essay-based assessment. The Department has undertaken this change in the context of growing student numbers, increasing diversity of student cohorts and growing numbers of international students. These trends have increased the need to adopt more efficient and streamlined assessment processes.

IMPLEMENTATION

Over the past four years the Department has supported online submission by students and the increased use of marking tools. In 2014, the Head of Department and I initially worked together to explore different online tools to find sustainable assessment practices for increasing cohorts. We liaised with our IT partners, who encouraged us to work with Maria Papaefthimiou, as they were aware that the University was setting up a new TEL team. Maria introduced us to Blackboard rubrics, which we piloted for both practical and written forms of assessment.

These early initiatives were reviewed ahead of our decision to adopt online assessment for all undergraduate coursework (with a few exceptions such as technical tasks, examinations and tasks where self or peer assessment plays a particular role in the learning process). I then translated our paper-based forms into a set of Blackboard rubric templates for colleagues to work with and provided a workshop and video resources to support the transition.

For almost every submitted piece of work, students receive feedback from colleagues using either Turnitin or the Blackboard marking tool. Each piece has an online submission point so that colleagues can provide feedback online, often using the rubrics function within the Blackboard marking tool.

One of the challenges faced by the Department has been managing non-standard types of submission. Typography employs a particularly broad range of assessment types including self- and peer-assessment and group work. It also handles a range of different physical submissions such as books or posters and assessment involving creating designs like websites and app prototypes that exist only in digital form.

Because of the nature of the work, dual submission is common. Our policy of online submission for written work and dual submission for practical work ensures that – regardless of the nature of the work – students receive feedback and grades in a consistent manner throughout their degree.

More recently, we have introduced some new practices that support the development of professional skills and enhance the transparency of group work. For example, professional practice assignments use a project management app, Trello. Students are assessed on their usage and the content (including reflection) they input into the app. The tutor can, for example, set up a Trello group and monitor group activity. Some practical modules require students to use prototyping software or create videos. In these cases, it might be easier for students to share links to this content either by submitting the link itself online to Blackboard or to a dedicated Typography submission e-mail address monitored by administrative colleagues (although this second approach may change as we work with the EMA Team).

A second issue faced by the Department during implementation, as a result of the significant diversity of assessment, is that the management of online submission can become confusing for students in terms of what exactly they should submit and how. The diversity of assessment allows students to demonstrate a range of learning outcomes and a broad skills base, but the Department has had to ensure that students fully understand the range of submission practices. This challenge exists both in Part 1, when students are being introduced to new practices, and in Parts 2 and 3, where a single design brief may have multiple deliverables. We are continually working to find the best balance between ensuring the kind of submission is always appropriate to the learning outcomes, provides students with experience in industry-standard software and tools, and is accompanied by clear guidance about submission requirements.

IMPACT

The shift from offline to online assessment within the Department has led to a range of changes to the staff and student experience:

1. Online feedback for students has meant that they now always know where their feedback is. There is no need for them to contact their tutors to access content.

2. For some staff, the use of online marking and feedback has meant spending some time getting used to the interface and learning about the functionality of the tools, particularly the Blackboard marking tool. There have been some issues surrounding the accessibility of rubrics within Blackboard and their consistent use, which the Department has had to work through. In general, colleagues are now reporting that online marking has significantly reduced marking time, especially where more detailed rubrics have been developed and trialled in the current academic year.

3. The Department has spent time thinking carefully about the consistency of the student assessment experience and making the most of the functionality of the tools to make marking easier and, potentially, quicker. As a result, there is a sense that the practices adopted are more sustainable and streamlined, which has been important given rising student numbers and increasingly diverse cohorts.

REFLECTIONS

Over the last year, following recommendations from Periodic Review, the Department has been trialling different practices such as the creation of much more detailed rubrics. As noted above, detailed rubrics seem to reduce marking and feedback time, while providing students with more clarity about the specific criteria used to assess individual projects. However, these do not always accommodate the range of ways in which students can achieve the learning outcomes for creative briefs or encourage the design literacy and independent judgment we want students to develop.
We are also working on ensuring that the terminology used in these rubrics is mapped appropriately to the level of professional skill expected in each part of the degree. The Department is currently looking at the impact of this activity to identify best practice.

Typography is keen to continue to provide a range of assessment options necessary for developing professional skills and industry-relevant portfolios within the discipline. We are committed to complementing this diversity with an assessment and feedback process that gives students a reassuring level of consistency and enables them to evaluate their performance across modules.
There is some scope to develop the marking tools being used. It would, for example, be very helpful if Blackboard could develop a feature allowing students to access their feedback before they can see their marks, or if it allowed colleagues to give a banded mark (such as 60–64), which is appropriate formative feedback in some modules. In addition, Typography students have reported that the user experience could be improved and that the interface could be more intuitive. For example, it could contain fewer layers of information, and access to feedback and marks could be more direct.

More broadly, the shift from offline to online practices has been one driver for the Department to reflect on existing assessment practices. In particular, we have begun to consider how we can better support students’ assessment literacy and have engaged with students to review new practices. Their feedback, in combination with our broader engagement with the new Curriculum Framework and its impact on Programme Level Assessment, is informing the development of a new set of rubric templates to be adopted in autumn 2018.

LINKS

For further information please see the short blog post, ‘Curriculum Review in Practice: Aligning to the Curriculum Framework – first steps’, at:
http://blogs.reading.ac.uk/engage-in-teaching-and-learning/2018/04/09/curriculum-review-in-practice-aligning-to-the-curriculum-framework-first-steps-started-by-jeanne-louise-moys-rob-banham-james-lloyd/

Pre-sessional English use of Turnitin’s online marking tool – Rob Playfair, IFP Course Tutor

OBJECTIVES

I was interested in improving the efficiency of my marking, and liked the idea of having a digital record of written feedback to students. During the PSE induction for new tutors we were told that the University is moving towards e-feedback over the next few years so it seemed like a useful skill to acquire.

CONTEXT

My group of international students was on a nine-week course to improve their level of English before starting their postgraduate studies. They needed to write three 500-word essays and one 1,500-word project. For each of these, students wrote two drafts. I needed to provide written feedback on both drafts and the final version of each essay – i.e. a lot of marking!

IMPLEMENTATION

Jonathan Smith, PSE course director and ISLI TEL Director, gave all teachers a one-hour workshop on how to use Turnitin and Grademark, during which we had a chance to get hands on with the software. Each year Jonathan runs a training session for new members of PSE staff who will work on the PSE courses during the summer.

Later, Jonathan shared the PSE ‘QuickMarks’ with those of us who had opted to use e-feedback. We could download these, via our QuickMarks library, into our own personal QuickMarks set. These comments were then available each time we opened an essay.

The QuickMarks focussed on common student errors, with explanations and links to relevant sources. They are based not only on common grammar and lexical errors but also on the complexity of the structures used and on coherence and cohesion in the texts.

Students grew accustomed to submitting work, accessing feedback and seeing their progress.

IMPACT

• It was quicker to note common student errors in-text using the QuickMarks than repeatedly handwriting the same comments.

• Students were able to read and start acting on my feedback as soon as I had finished marking, rather than waiting until the next class.

• I could quickly refer to previous drafts and the comments I had given to monitor uptake.

• I could browse work from students who were not in my class, via the Turnitin feedback suite, to see a broader range of essays and also see the feedback that colleagues were giving because, in this case, the point of submission was the same for the whole cohort. As this was my first experience teaching the programme, this was particularly useful.

REFLECTIONS

The speed of communication with students was the biggest benefit – as soon as my marking was done students could see it. This meant that students could formulate questions about my feedback before class, making the time in class much more productive.

In terms of quality of marking, I think there might be a tendency to over-mark using the QuickMarks, because it only takes a second to add one, yet each mark creates quite a lot for the student to do – reading an explanation and perhaps visiting a website. I’d like to explore the impact of this on uptake.

Finally, on a practical level I found this helped my organisation – all the scripts, scores and comments are in one place. It was also easier to submit scripts for moderation: I just gave the names of students to the Course Directors who could go into the system and see the scripts themselves.

FOLLOW UP

• I’m currently using it in a similar way on the International Foundation Programme (IFP).

• At present all students can do is upload their work then download my comments. I’d be interested in a function which allows students to respond to my comments – making corrections or asking questions. This would support the feedback cycle.

• To improve the reliability of the summative scores, I wonder whether we can learn from elements of comparative judgment programmes such as No More Marking.

LINKS

www.nomoremarking.com

http://www.reading.ac.uk/internal/ema/ema-news.aspx

https://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Rethinking assessment design, to improve the student/staff experience when dealing with video submissions

Rachel Warner, School of Arts and Communication Design

Rachel.Warner@pgr.reading.ac.uk

Jacqueline Fairbairn, Centre for Quality Support and Development

j.fairbairn@reading.ac.uk

Overview

Rachel, in Typography and Graphic Communication (T&GC), worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow and improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.

This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.

Objectives

  • Improve staff and student experiences when dealing with video assignment submissions. Specifically, streamlining workflows by improving student assessment literacy and making use of university OneDrive accounts.
  • Support students to develop professional skills for the future, through assessment design (developing digital literacies and communication skills).
  • Provide an authentic assessment experience, in which students self-select technologies (choosing software and a task to demonstrate) to answer a brief.

Context

The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.

Rachel wanted to review the process for submitting video work for e-assessment, and to find ways to streamline the time-intensive marking process, particularly in accessing and reviewing video files, without compromising good assessment practice. This is also acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”

The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving the assessment literacy of students was also a key driver to reduce the number of queries and technical issues when working with video technologies.

Implementation

Rachel consulted TEL, to discuss:

  • balancing the pedagogic implications of altering the assignment
  • technical implications, such as submission to Blackboard and storage of video

To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Use of OneDrive encouraged professional behaviours such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised, industry-standard platform.

To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.

Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.

The most effective intervention was the introduction of a formative support session (1.5hr). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop supported students to understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, toolkit and digital coversheet, and allowed students to explore the technologies in a supported manner (improving students’ assessment literacy!).

The assignment instructions were strategically revised, to include information (‘hints and tips’) to support the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).

Students were given the option to self-select recording/editing software to undertake the screencast video. Free-to-use tools that students could explore were recommended: ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.

Impact

Marking the submissions was made easier by the ability to access videos in a consistent format, using a clearly structured submission process (digital coversheet). The ability to play URL links directly through OneDrive meant Rachel was able to store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting the key timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).

Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly sped up the marking workflow and allowed Rachel to identify patterns in students’ work where common feedback statements could be applied, as appropriate.

The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.

Reflections

The modifications to the assessment:

  • Reflected professional digital competencies required of the discipline;
  • Allowed students to explore a new technology and way of working in a supported context; and,
  • Built confidence, facilitated assessment literacy, and encouraged reflection.

Future modifications to the screencast submission:

  • Peer review could be implemented, asking students to upload videos to a shared space for formative feedback (such as Facebook or a Blackboard discussion board).
  • The digital coversheet had to be downloaded to access URL links. In future, students could paste into the submission comment field, for easier access when marking.
  • Rachel is developing a self-assessment checklist to help students reflect on the production values of their work. The summative assessment rubric is focused on video content, not production values; however, it would be useful for students to get feedback on professional work skills – for example, communication skills and the use of narrative devices that translate across other graphic mediums.

Toolkit basics:

(Image: thumbnail of the toolkit document; full access available via links on the webpage.)

  • Outline task expectations and software options, give recommendations
  • Source examples of screencasts from your industry, discuss with students.
  • Provide hints and tips for creating effective screencasts.
  • Provide submission text. Consider asking students to use the ‘submission comment’ field to paste links to their work, for quick marker access to URLs.
  • Plan a formative workshop session, to practise using the software and go through the submission process (time invested here is key!).
  • Create a self-assessment checklist to enhance the production quality of videos and highlight the transferable skills that can be developed by focusing on the quality of the production.
  • Consider creating a shared online space for formative peer-feedback (e.g. Blackboard discussion forum).
  • Consider using a marking spreadsheet to cross-reference feedback and highlight good examples of screencasts that can be utilised in other teaching.

Links

Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and to learn from ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.

Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a technique in Photoshop that requires multiple elements to achieve results. It has a conclusion demonstrating the student’s awareness that the technique is useful in scenarios other than the one demonstrated, giving the listener encouragement to continue learning. The student has used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.

Screencast example: (YouTube link) This demonstrates a student who is competent in a tool, able to use their own work (work from another module on the course) to demonstrate a task, and additionally includes their research into how the tool can be used for other tasks.

Other screencast activity from the Typography & Graphic Communication department from the GRASS project:  (Blog post) Previous project for Part 1s that included use of screencasts to demonstrate students’ achievements of learning outcomes.

Celebrating Student Success Through Staff-Student Publication Projects

Dr Madeleine Davies, Department of English Literature

m.k.davies@reading.ac.uk

Overview

In 2017 I replaced the exam on a Part 3 module I convene (‘Margaret Atwood’) with an online learning journal assessment and I was so impressed with the students’ work that I sought funding to publish selected extracts in a UoR book, Second Sight: The Margaret Atwood Learning Journals. The project has involved collaboration between the Department of English Literature and the Department of Typography & Graphic Communication, and it has confirmed the value of staff-student partnerships, particularly in relation to celebrating student attainment and enhancing graduate employability.

Objectives

  • To showcase the achievements of our Part 3 students before they graduate and to memorialise their hard work, engagement and ingenuity in material form
  • To demonstrate at Open Days and Visit Days the quality of teaching and learning in the Department of English Literature in order to support student recruitment
  • To create a resource for students enrolling on the module in future years
  • To encourage reflection and conversation in my School regarding the value of diversified assessment practice

Context

The ‘Margaret Atwood’ module has always been assessed through an exam and a summative essay, but I was dissatisfied with the work the exam produced (I knew that my students could perform better), so I researched alternative assessment formats. In 2017 I replaced the exam with a Blackboard learning journal because my research suggested that it offered the potential to release students’ creative criticality. I preserved the other half of the assessment model, the formal summative essay, because the module also needed an assessment where polished critical reading would be rewarded. With both assessment elements in place, students would need to demonstrate flexible writing skills and adapt to different writing environments (essential graduate skills). A manifest benefit of journal assessment is that it offers students to whom essay-writing does not come easily an opportunity to demonstrate their true ability and engagement, so the decision to diversify assessment also connected with inclusive practice.

Implementation

I decided to publish the students’ writing in a UoR book because I did not want to lose their hard work to a digital black-hole: it deserved a wider audience. I sought funding from our Teaching and Learning Deans, who supported the project from the beginning, and I connected with the ‘Real Jobs’ scheme in the Department of Typography & Graphic Communication where students gain valuable professional experience by managing funded publishing commissions for university staff and external clients. This put me in contact with a highly skilled student typographer with an exceptional eye for design. I asked a member of the ‘Margaret Atwood’ group to help me edit the book because I knew that she wanted to pursue a career in publishing and this project would provide invaluable material for her CV. Together we produced a ‘permissions’ form for students to formally indicate that they were releasing their work to the publication and 27 out of 36 students who were enrolled on the Spring Term module responded; all warmly welcomed the initiative. Contributors were asked to submit Word files containing their entries so as to preserve the confidentiality of their online submissions; this was important because the editors and designers were fellow students. Throughout the Summer Term 2018, the students and I met and planned, designed and edited, and the result is a book of which we are proud. With the sole exception of the Introduction which I wrote, every element of it, from the cover image to the design to the contents, is the work of our students.

Impact

The impact of the project will be registered in terms of Open Days because Second Sight will help demonstrate the range of staff-student academic and employability activities in DEL. In addition, the project has consolidated connections between DEL and the Department of Typography & Graphic Communication and we will build on this relationship in the next session.

A further impact, which cannot be evidenced easily, is that it provides a useful resource for our graduates’ job applications and interviews: students entering publishing or journalism, for example, will be able to speak to their participation in the project and to their work in the book. The collection showcases some excellent writing and artwork and DEL graduates can attend interviews with tangible evidence of their achievements and abilities.

Reflections

Producing this book with such talented editors, designers and contributors has been a joy: like the ‘Margaret Atwood’ module itself, Second Sight confirms the pleasures and the rewards of working in partnership with our students.

The project sharpened my own editing skills and created a space to share knowledge about publishing conventions with the students who were assisting me. We all learned a great deal from each other: June Lin, the Typography student designer, gave me and the student editor (Bethany Barnett-Sanders) insights into the techniques of type-setting and page layout. To reciprocate, Bethany and I enhanced June’s knowledge of Margaret Atwood’s work which she had read but never studied. This pooling of knowledge worked to the benefit of us all.

One of the advantages of the learning journal was that it allowed me a clear view of the inventiveness and ingenuity that students bring to their work, and my sense of appreciation for their skill was further enhanced by working with students on the book. Technically, this was less of a ‘staff-student’ collaboration than it was a mutual education between several people.

The process we followed for acquiring written permission from students to include their work in the book, and for gathering Word files to avoid confidentiality issues, was smooth, quick, and could not have been improved. The only difficulty was finding time to edit seventy-five contributions to the book in an already busy term. Whilst this was not easy, the results of the collaboration have made it well and truly worth it.

Follow up

It is too early to tell whether other DEL colleagues will choose to diversify their own assessments and, if they do, whether they will pursue a publishing project similar to the ‘Margaret Atwood’ example. There is, however, a growing need for Open Day materials and Second Sight joins the Department’s Creative Writing Anthology to demonstrate that academic modules contain within them the potential for publication and collaborative initiative. I will certainly be looking to produce more publications of this nature on my other learning journal modules in the next session; in the meantime, copies of Second Sight will be taken with me to the outreach events I’m attending in July in order to demonstrate our commitment to student engagement, experience and employability here at the University of Reading.

Related entries

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment-part-2/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-in-student-participation-at-academic-conferences/


Engaging students in assessment design

Dr Maria Kambouri-Danos, Institute of Education

m.kambouridanos@reading.ac.uk

Year of activity 2016/17

Overview

This entry aims to share the experience of re-designing and evaluating assessment in collaboration with students. It explains the need for developing the new assessment design and then discusses the process of implementing and evaluating its appropriateness. It finally reflects on the impact of MCQ tests, when assessing students in higher education (HE), and the importance of engaging students as partners in the development of new assessment tools.

Objectives

  • To re-design assessment and remove a high-stakes assessment element.
  • To proactively engage ‘students as partners’ in the development and evaluation of the new assessment tool.
  • To identify the appropriateness of the new design and its impact on both students and staff.

Context

Child Development (ED3FCD) is the core module for the BA in Children’s Development and Learning (BACDL), meaning that a pass grade must be achieved on the first submission to gain a BA Honours degree classification (failing leads to an ordinary degree). The assessment needed to be redesigned as it put the total weight of students’ mark on one essay. As the programme director, I wanted to engage the students in the re-design process and evaluate the impact of the new design on both students and staff.

Implementation

After attending a session on ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, I decided to explore further the idea of using multiple choice question (MCQ) tests. To do so, I attended a session on ‘Team Based Learning (TBL)’ and another on ‘MCQ: More than just a Test of Information Recall’, to gather targeted knowledge about designing effective MCQ questions.

I realised that MCQ tests can help assess students’ understanding and knowledge and also stimulate students’ active and self-managed learning. Guided by the idea of ‘assessment for learning’, I proposed the use of an MCQ test during a steering group meeting (employees and alumni) and a Board of Studies (BoS) meeting, which 2nd year Foundation Degree as well as BACDL student representatives attended. The idea met with some initial resistance, as MCQ tests are not traditionally used in HE education departments. However, after exploring different options and highlighting the advantages of MCQ tests, agreement was unanimous. At the last BoS meeting (2016), students and staff finalised the proposal for the new design, proposing to use the MCQ test for 20% of the overall mark and keeping the essay for the remaining 80%.

At the beginning of 2017, I invited all BACDL students to anonymously post their thoughts and concerns about the new design (and the MCQ test) on Padlet. Based on these comments, I then worked closely with the programme’s student representatives and had regular meetings to discuss, plan and finalise the assessment design. We decided how to calculate the final mark (as the test was completed individually and then in a group) as well as the total number of questions, the duration of the test, etc.  A pilot study was then conducted during which a sample MCQ test was shared with all the students, asking them to practise and then provide feedback. This helped to decide the style of the questions used for the final test, an example of which is given below:

There are now more than one million learners in UK schools who speak English as an additional language (EAL). This represents a considerable proportion of the school population, well above 15 per cent. To help EAL children develop their English, teachers should do all the following, except…

a. use more pictures and photographs to help children make sense of new information.

b. use drama and role play to make learning memorable and encourage empathy.

c. maintain and develop the child’s first language alongside improving their English.

d. get children to work individually because getting them into groups will confuse them and make them feel bad for not understanding.

e. provide opportunities to talk before writing and use drills to help children memorise new language.

Impact

Students were highly engaged in the process of developing the new design, and the staff-student collaboration encouraged the development of bonds within the group. The students were excited by the opportunity to actively develop their own course, and the experience empowered them to take ownership of their own learning. All of them agreed that they felt important and, as a student representative said, that “their voices were heard”.

The new design encouraged students to take the time to gauge what they already know and identify their strengths and weaknesses. Students themselves noted that the MCQ test helped them to develop their learning as it was an additional study opportunity. One of them commented that “…writing notes was a good preparation for the exam. The examination was a good learning experience.” Staff also agreed that the test enabled students to (re)evaluate their own performance and enhance their learning. One of the team members noted that the “…test was highly appropriate for the module as it offered an opportunity for students to demonstrate their proficiency against all of the learning outcomes”.

Reflections

The new assessment design was implemented successfully because listening to the students’ voice and responding to their feedback was an essential part of the designing process. Providing opportunities to both students and staff to offer their views and opinions and clearly recognising and responding to their needs were essential, as these measures empowered them and helped them to take ownership of their learning.

The BACDL experience suggests that MCQ tests can be adapted and used for different subject areas as well as to measure a great variety of educational objectives. Their flexibility means that they can be used for different levels of study or learning outcomes, from simple recall of knowledge to more complex levels, such as the student’s ability to analyse phenomena or apply principles to new situations.

However, good MCQ tests take time to develop. It is hoped that next year the process of developing the test will be less time-consuming as we already have a bank of questions that we could use. This will enable randomisation of questions which will also help to avoid misconduct. We are also investigating options that would allow for the test to be administered online, meaning that feedback could be offered immediately, reducing even further the time/effort required to mark the test.
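The planned randomisation from a question bank could be sketched roughly as follows. This is only an illustration: the bank structure, function name and figures are hypothetical, not the module’s actual system.

```python
import random

def draw_paper(bank, n_questions, seed=None):
    """Draw a randomised MCQ paper from a question bank.

    Each sitting gets a different selection of questions and a
    different ordering of answer options, which helps deter
    misconduct between candidates.
    """
    rng = random.Random(seed)
    selected = rng.sample(bank, n_questions)   # distinct questions, no repeats
    paper = []
    for q in selected:
        options = q["options"][:]              # copy so the bank is not mutated
        rng.shuffle(options)                   # vary the option order too
        paper.append({"stem": q["stem"],
                      "options": options,
                      "answer": q["answer"]})
    return paper

# A toy bank of 50 questions (hypothetical data).
bank = [{"stem": f"Question {i}",
         "options": ["a", "b", "c", "d"],
         "answer": "a"}
        for i in range(50)]

# A 20-question paper; a fixed seed makes the draw reproducible.
paper = draw_paper(bank, 20, seed=1)
```

Administering the test online, as mentioned above, would then simply mean scoring each submitted answer against the stored `answer` key.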

Follow up

MCQ tests are not a panacea; just like any other type of assessment tool, they have advantages and limitations. This project has confirmed, however, that they are adaptable across subject areas and educational objectives. The evaluation of the assessment design will continue next year, and further feedback will be collected from the cohort and from next year’s student representatives.

Independent research and research dissemination in undergraduate teaching

Dr. Ute Woelfel, Literature and Languages
u.wolfel@reading.ac.uk
Year of activity: 2016/17

Overview

In order to improve students’ engagement, support their abilities as independent learners, and increase their feeling of ownership for their academic work, elements of independent research and research dissemination through the creation of research posters were included in a Part 2 module.

Objectives

  • Boost independent learning.
  • Nurture research interests.
  • Increase feeling of ownership.
  • Develop employability skills.

Context

In 2016/17 I introduced a new Part Two module on German National Cinema (GM2CG: 20 credits/ 30 contact hours). The module is intended to give students a general overview of German cinema from the end of World War I to German unification, while at the same time allowing sustained independent work on themes of interest. In order to increase engagement with these themes, the independent work is research-oriented, requiring students to reflect on their own expectations and aims, their goals for the module and indeed the course, and to develop their own interests and approach.

Implementation

The students were asked at the beginning to pick a period or topic from a list and prepare a presentation. The presentation was not part of the summative assessment but served as a foundation for further research. After the presentation, individual discussions with each student were used to decide which aspect of the theme/topic the student would like to pursue further. After each term, essay surgeries were offered in which students were given the opportunity to discuss the research done so far and decide on a concrete research question for their essay (2,500 words/ 30%). The students were then asked to turn the findings of their essays into research posters for dissemination to non-specialist audiences (10%). In order to make sure that students also gain a general understanding of German cinema, a final exam (60%) is scheduled in the summer term.

Impact

The inclusion of independent research elements was very successful in that students engaged more than they normally do when given set topics and essay titles. The majority of students found secondary sources, and even additional primary sources, and often identified research topics they would like to pursue in the future. Both the essay and the exam marks were above average. The poster challenged students to re-think their academic findings and present them in a new, visually organised format for interested general audiences. As we used the posters to showcase the students’ work at the University’s Languages Festival, the Visit Days and a Reading Scholars outreach event, a sense of the importance of their work emerged, and pride grew in what they had achieved. The students understood the relevance of the poster for the development of professional skills.

Reflections

The module worked well and highlighted above all the potential our students have, and can develop, in the right learning environment, as well as their willingness to work hard when they are committed. Their engagement with independent research signalled a wish to be active and to explore options beyond the set class texts rather than being spoon-fed; there is a clear need to feel involved, responsible and in charge of one’s work. I was particularly surprised by how much effort students were prepared to put into the presentations, despite the fact that they did not count towards the module mark; as the presentations were used as a foundation for assessment, students clearly understood their benefit.

The research elements made the module learning- and teaching-intensive, as a good number of office hours and slots during the enhancement weeks were used for individual discussions of research and essay topics. As I wanted the students to put their research posters to good use, additional feedback slots were offered in which I discussed not just marks but ways of improving the posters; students showed great willingness to work even further on their posters just to see them exhibited, despite the fact that any further input would not change the mark.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 1)

Dr Madeleine Davies, School of Literature and Languages

Overview

The Department of English Literature (DEL) is organising student focus groups as part of our TLDF-funded ‘Diversifying Assessments’ project, led by Dr Chloe Houston and Dr Madeleine Davies. This initiative is in dialogue with Curriculum Framework emphases on engaging students in Programme Development and involving them as stakeholders. This entry outlines the preparatory steps taken to set up our focus groups, the feedback from the first meeting, and our initial responses to it.

Objectives

  • To involve students in developing a more varied suite of assessment methods in DEL.
  • To hear student views on existing assessment patterns and methods.
  • To gather student responses to electronic methods of assessment (including learning journals, blogs, vlogs and wikis).

Context

We wanted to use Curriculum Framework emphases on Programme Review and Development to address assessment practices in DEL. We had pre-identified areas where our current systems might usefully be reviewed and we decided to use student focus groups to provide valuable qualitative data about our practices so that we could make sure that any changes were informed by student consultation.

Implementation

I attended a People Development session ‘Conducting Focus Groups’ to gather targeted knowledge about setting up focus groups and about analytical models of feedback evaluation. I also attended a CQSD event, ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, to gain new ideas about feedback practice.

I applied for and won TLDF mini-project funding to support the Diversifying Assessments project. The TLDF funding enabled us to treat the student focus groups as a year-long consultative process, supporting a review of assessment models and feedback practices in DEL.

In Spring Term 2017, I emailed our undergraduate students and recruited 11 participants for the first focus group meeting. We aim to include as diverse a range of participants as possible in the three focus group meetings planned for 2016-17, and to draw contributors from all parts of the undergraduate programme.

To prepare the first focus group:

  • I led a DEL staff development session on the Diversifying Assessment project at the School of Literature and Languages’ assessment and feedback away day; this helped me to identify key questions and topics with colleagues.
  • I conducted a quantitative audit of our assessment patterns and presented this material at the staff session to illustrate the nature of the issues we aim to address. Seeing the situation set out in tabulated form made it clear to colleagues that a review of assessment and feedback was needed.

At the first focus group meeting, topics and questions were introduced by the two project leaders, and our graduate intern, Michael Lyons, took minutes. We were careful not to approach the group with clear answers already in mind: we used visual aids to open conversation (see figures 1 and 2) and to frame the key debates broadly. We also used open-ended questions to encourage detail and elaboration.

Group discussion revealed a range of issues and opinions that we would not have been able to anticipate had we not held the focus group:

  • Students said that a module’s assessment pattern was the key determinant in their selection of modules.
  • Some students reported that they seek to avoid exams where possible at Part Two.
  • Discussing why they avoid exams, students said that the material they learn for exams does not ‘stick’ in the same way as material prepared for assessed essays and learning journals so they feel that exams are less helpful in terms of learning. Some stated that they do not believe that exams offer a fair assessment of their work.
  • Students wholly supported the use of learning journals because they spread the workload and because they facilitate learning. One issue the students emphasised, however, was that material supporting learning journals had to be thorough and clear.
  • Presentations were not rated as highly as a learning or assessment tool, though a connection with employability was recognised.
  • Assessed essays were a popular method of assessment: students said they were proud of the work they produced for summative essays and that only ‘bunched deadlines’ caused them problems (see below). This response was particularly marked at Part Two.
  • Following further discussion it emerged that our students had fewer complaints about the assessment models we used, or about the amount of assessment in the programme, than they did about the assessment feedback. This is represented below:

To open conversation, students placed a note on a scale. The question was, ‘Do we assess too much, about right, or not enough?’ (‘About right’ was the clear winner.)

Students placed a note on the scale: the question was, ‘Do we give you too much feedback, about right, or too little?’ (The responses favoured the scale between ‘about right’ and ‘too little’.)


The results of this exercise, together with our subsequent conversation, helped us to understand the importance of feedback to the Diversifying Assessments project. However, after the focus group meeting, the DEL Exams Board received an excellent report from our External Examiners, who stated that our feedback practices are ‘exemplary’. We will disseminate this information to our students who, with no experience of feedback practices other than at the University of Reading, may not realise that DEL’s feedback is regarded as an example of best practice by colleagues from other institutions. We are also considering sending our students updates while assessed marking is underway, so that they know when to expect their marks and can see that we consistently meet the 15-day turnaround. The External Examiners’ feedback will not, however, prevent us from continuing to reflect on our feedback processes in an effort to enhance them further.

Following the focus group meeting, we decided to test the feedback we had gathered by sending a whole-cohort online survey. For this survey, we changed the ‘feedback’ question slightly to encourage a more detailed and nuanced response. The results, which confirmed the focus group findings, are represented below (with thanks to Michael Lyons for producing these graphics for the project):

A total of 95 DEL students took part in the survey. 87% said they valued the opportunity to be assessed with diverse methods.

Assessed essays were the most popular method of assessment, followed by the learning journal. Since only a small proportion of students have been assessed with a learning journal, it is likely that a high percentage of those who have experienced it named it as their preferred method of assessment.

On a scale from 0-10 (with 0 being too little, 5 about right, and 10 too much), the students gave an average score of 5.1 for the level of assessment on their programmes with 5 being both the mode and the median scores.

34% found the level of detail covered most useful in feedback, 23% the feedback on writing style, 16% the clarity of the feedback, and 13% its promptness. 7% cited other issues (e.g. ‘sensitivity’) and 7% did not respond to this question.

66% said they always submit formative essays, 18% do so regularly, 8% half of the time, 4% sometimes, and 4% never do.

40% said they always attend essay supervisions (tutorials) for their formative essays, 14% do so regularly, 10% half of the time, 22% sometimes, and 14% never do.

Impact

The focus group conversation suggested that the area on which we need to focus in DEL, in terms of diversification of assessment models, is Part Two assessment provision because Part One and Part Three already have more diversified assessments. However, students articulated important concerns about the ‘bunching’ of deadlines across the programme; it may be that we need to consider the timing of essay deadlines as much as we need to consider the assessment models themselves. This is a conversation that will be carried forward into the new academic year.

Impact 1: Working with the programme requirement (two different types of assessment per module), we plan to move more modules away from the 2,000-word assessed essay and exam model that 80% of our Part Two modules have been using. We are now working towards an assessment landscape where, in the 2017-18 academic session, only 50% of Part Two modules will use this assessment pattern. The others will use a variety of assessment models, potentially including: learning journals and assessed essays; assessed presentations and assessed essays; vlogs and exams; wikis, presentations and assessed essays; or blogs and 5,000-word module reports.

Impact 2: We will address the ‘bunched’ deadlines problem by producing an assessments spreadsheet that plots each assessment point on each module, allowing us to retain an overview of students’ workload and to spread deadlines more evenly.

Impact 3: The next phase of the project will focus on the type, quality and delivery of feedback. Prior to the focus group, we had not realised how crucial this issue was, though the External Examiners’ 2017 report for DEL suggests that communication may be the more crucial factor in this regard. Nevertheless, we will disseminate the results of the online survey to colleagues and encourage more detail, and more advice on writing style, in feedback.

Anticipated impact 4: We are expecting enhanced attainment as a result of these changes because the new assessment methods, and the more even spread of assessment points, will allow students to present work that more accurately reflects their ability. Further, enhanced feedback will provide students with the learning tools to improve the quality of their work.

Reflections

Initially, I had some reservations about whether student focus groups could give us the reliable data we needed to underpin assessment changes in DEL. However, the combination of quantitative data (via the statistical audit I undertook and the online survey) and qualitative data (gathered via the focus groups and again by the online survey) has produced a dependable foundation. In addition, ensuring the inclusion of a diverse range of students in a focus group, drawn from all levels of the degree and from as many communities as possible within the cohort, is essential for the credibility of the subsequent analysis of responses. Thorough reporting is also essential, as is the need to listen to what is being said: we had not fully appreciated how important the ‘bunched deadlines’, ‘exams’, and ‘feedback’ issues were to our students. Focus groups cannot succeed unless those convening them respond proactively to feedback.

Follow up

There will be two further DEL student focus group meetings: one in the Autumn Term 2017 (to provide feedback on our plans and to encourage reflection in the area of feedback) and one in the Spring Term 2018 (for a final consultation prior to implementation of new assessment strategies). It is worth adding that, though we have not yet advertised the Autumn Term focus group meeting, six students have already emailed me requesting a place on it. There is clearly an appetite to become involved in our assessment review, and student contribution to this process has already demonstrated its value in terms of teaching and learning development.