Using rubrics to transform the marking and feedback experience – Professor Will Hughes

Professor Will Hughes has used rubric grids within Grademark extensively across all of his modules to significantly enhance student engagement with his feedback, student understanding of his marking criteria, and student attainment in subsequent essays, whilst making his own experience of marking more efficient.


My research interests include the control and management of building contracts, the management of design in construction, the analysis of organizational structure, and the analysis of procurement systems. The focus of my work is the commercial processes of structuring, negotiating, recording and enforcing business deals in construction. I have developed a framework for modelling and describing the myriad permutations of procurement variables, to aid in analysis and understanding of the complexities of organizing the procurement of built facilities. This has been incorporated into a British Standard (2011) on construction procurement.

OBJECTIVES

As convenor of a range of modules typically enrolling 120 students submitting around three pieces of work each year, I wanted to ensure that I had a really effective approach to marking and the provision of feedback. I wanted all of my students to engage fully with the feedback I provided, and to understand exactly what they had done well and where they could make improvements after reading each piece of feedback. But I needed to achieve all of this in an effective and efficient way.

CONTEXT

National Student Survey results suggest that a significant number of students do not feel that they have access to marking criteria prior to submission and do not understand how to improve their performance based on the comments provided. Often the provision of more and more free-text feedback does not appear to feed into higher levels of student attainment and satisfaction. At the same time, increasing student numbers and broader workload demands have increased pressures on all lecturers across the sector. In response I decided to adopt the use of rubric grids as one way to start to address these key issues.

IMPLEMENTATION

In 2015 I created a rubric grid in which I listed criteria down the left-hand side and then unpicked what performance against each of those criteria might look like, describing different levels of performance from the 10–20 range all the way up to outstanding performance in the 90–100 range. It was extremely interesting to attempt a clear articulation of the differences between grades of failure and grades of excellence. Explaining, for example, the difference between 90% and 100% for a specific criterion is not something I had ever done before. A screenshot of a typical grid is shown below.
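
For illustration, a single criterion row from such a grid might read as follows (the wording here is a hypothetical sketch rather than a row from the actual rubric):

Criterion: Use of evidence
0–19: assertions largely unsupported; little or no relevant evidence cited.
40–49: some relevant sources used, but descriptively rather than critically.
60–69: a good range of sources deployed in support of a clear argument.
90–100: exceptional command of the literature, marshalled with precision and originality.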

I actually created a slightly different grid for each piece of assessment, but it would be equally possible to create a less assessment-specific grid that could be used across a whole module or even a whole programme.

Crucially, I shared the criteria for assessment with my students in the assignment brief so they knew, well ahead of submission, what the marking criteria themselves looked like.

I created all of this content in a standard Excel spreadsheet first, then clicked on the ‘rubric manager’ button and then ‘import’ to transfer my grid into Grademark. I could instead have created it directly within Grademark, in an incredibly simple process, by clicking on the ‘rubric’ icon, ‘rubric manager’, ‘rubric list’ and then ‘create new rubric’, and then populating my grid with specific criteria and scales. Once the rubric grid is attached to one assignment, Grademark automatically attaches it to all the assignments within the submission point.

This meant that each time I opened a piece of work in Grademark, I could click on the rubric icon to display the grid. I could then simply click the box against each criterion that applied to the particular assessment I was marking to show the student how they had performed in that particular skill.

In addition to using the rubric grid to classify student performance against individual marking criteria, I would also provide in-text comments and general comments in the free-text feedback box to ensure that really tailored and specific feedback was provided to all of my students. As I have become more experienced, however, I have tried to stop myself from adding in-text comments, as doing so tends to result in detailed editing comments, which are not as helpful as feedback.

IMPACT

From the first time I used this approach, students have been enthusiastic. They have emailed me personally, as well as commenting in module evaluation forms, to say that they found the feedback more useful than anything they had received in their education to date. I no longer have students complaining that their mark is too low and asking whether I have made a mistake. Rather, those who would have complained begin by acknowledging that the mark is clear and well-justified and that they would like to discuss how to improve. This positive approach from students is refreshing.

REFLECTIONS

One of the things that made this activity successful was the prior development of a feedback library, which provided a wide-ranging list of comments to draw from and summarise. Another has been the move towards making comments positive rather than negative. It can be very difficult to focus on what students have done well in a poor submission, but it has proved to be the single most valuable thing. The performance of weak students improves significantly when they are given encouragement rather than discouragement. And strong students appreciate being given indications of how they could improve as well, which, they tell me, is rare but welcome. I still have a way to go in making all of the comments positive and encouraging. If I were starting over, I would begin by thinking seriously about how to sound encouraging and positive when students submit very low-quality work.

One thing to be careful about is that once the rubric has been attached to an assignment, it cannot be edited without being detached and losing all the grading. At first, I copied every mark into an Excel spreadsheet in case there were errors or omissions in my rubric that I hadn’t noticed until using it.

FOLLOW UP

Every piece of work I set up in Turnitin gives me the opportunity to fine-tune the approach. Each piece of work has its own criteria for assessment, so I tend to develop the rubrics in Excel, which makes them easier to adapt for the next piece of work. This also makes them easy to share with colleagues. If anyone would like further examples, I would be happy to share more recent ones as Excel files.

LINKS

Simple illustrated instructions for creating a similar qualitative rubric within Grademark, as well as standard or custom rubrics and free-response grading forms, can be found here:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/Rubrics_and_Grading_Forms

A six-minute Grademark video demonstrating the creation of a similar rubric and other key features is available here:

https://www.youtube.com/watch?v=BAG44Fpm55o

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic; the first outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students associate with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether teaching and learning aims were perceived by students to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but we decided this time to give students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A Survey Monkey was circulated to all DEL students with the same series of questions as those used for the focus group in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The Survey Monkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% of the focus group preferred to keep the traditional exam + assessed essay method, the survey found the aversion to exams to be even more pronounced: 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We found it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second-highest scorer (presumably a connection between the two was identified by students).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy are not entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, those that retain an exam would be left in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but diversification could leave modules that retain traditional assessment patterns vulnerable to students deselecting them. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL in a presentation. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the first link, which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: assessment in DEL, particularly at Part 2, follows a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments best suited to the attainment of them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.

Using Grademark to write high-quality feedback more rapidly in the School of Law – Dr Annika Newnham


Dr Newnham is the module convenor for LLB Family Law. Her areas of interest include child law, autopoietic theory, and the common intention constructive trust.

Since 2015, Annika has gradually personalised the ‘Quickmarks’ function within Turnitin Grademark to be both discipline-specific and assignment-specific. In addition, she has developed a lengthy comments bank which she can draw on and personalise, ensuring that she can write high-quality feedback more quickly and speeding up the entire marking process.

OBJECTIVES

The School of Law currently operates online submission, marking and feedback for the vast majority of assessed work. As part of this process it makes extensive use of Turnitin Grademark and some of the functionality on offer, including Quickmarks. Given the large numbers of students submitting work within the School and the need to provide high quality feedback quickly, I wanted to use these new tools to speed up the entire marking process and support the quality and quantity of feedback offered.

CONTEXT

The School of Law recruits strongly, makes extensive use of summative assessment and maintains a large number of core modules. Online assessment has been adopted, in part, to help support the continued provision of high quality feedback in this context while ensuring that feedback is returned to students within 15 working days.

IMPLEMENTATION

Grademark allows for the customisation of Quickmarks by individual markers. I very quickly began to customise the Quickmarks that were available to me by adding comments that I make frequently. Gradually, over time, my Quickmarks section has expanded to include a whole series of comments ranging from just a few words to lengthier sections of text. Dragging these across to relevant sections of text saves me a considerable amount of time because I am not writing out the same type of comment again and again. I have even developed my sets of Quickmarks to be specific not only to each module I teach but to each assignment I mark within that module. I carefully save each set with a different name so I can easily access it again. Grademark even remembers my Quickmarks sets from one year to the next, so my collection appears automatically when I open each new essay.

I wanted to explore the possibilities of reducing marking time whilst maintaining the quality and quantity of feedback in other areas. The Quickmarks approach worked very well for targeted in-text comments throughout the essay but, like most markers, I also leave summative text in the general comments section in the Grademark sidebar so that students have a sense of my overall thoughts.

I therefore started to compile a lengthy list of comments that I use extensively in a simple, separate Word document, ordering each set under key headings. Some of these are generic for all essays: writing style, referencing, structure and so on. There are also sets of comments on how students have tackled a particular issue in law, for example how well they have presented balanced arguments on commercial surrogacy, or have understood the different stages of a cohabitant’s claim for a share in her ex-partner’s house. Each heading contains 8-10 different sentences or longer sections covering a wide range of areas I may want to comment on. I am then able to cut and paste the most relevant into the Grademark comment box and, if needed, rewrite it to suit the specific essay I am working on.

This process has become even more efficient since the arrival of a second screen. I can list my commonly used statements on the left-hand screen and cut and paste or drag them over to the actual essay on my right-hand screen. Although I might then want to personalise the statement, I still save a significant amount of time in comparison to typing everything out repeatedly for each essay.
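
As an illustration of how such a comments bank might be organised (the heading and statements below are hypothetical examples rather than extracts from my actual document):

Structure
– A clearer introduction setting out the order of your argument would help the reader follow the essay.
– Consider signposting more explicitly how each section of the essay answers the question set.
– Some of your strongest points are buried mid-paragraph; lead with them.

Each statement can then be pasted into the general comments box and tailored to the individual essay.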

IMPACT

I maintain a balance between the use of Quickmarks, my comments bank and specific comments written for each piece of work. Students should not receive exactly the same comments time and time again. Feedback should not become a highly mechanised process. But Quickmarks and comments banks can be used as a starting point or work alongside very specific comments written for a particular piece of work. In this way I can maintain the quality and quantity of my feedback whilst speeding up the marking process considerably. In particular, this approach seems to ensure greater consistency between essays in terms of the amount of feedback that each student receives because it is so much quicker and easier to insert comments. More broadly it feels like a far more efficient process and is certainly a more fulfilling task to undertake.

REFLECTIONS

Quickmarks and cut-and-paste comments have made marking feel much less like a chore, and they remove the irritation often felt when you have to correct the same misunderstanding again and again for different students.

LINKS

Turnitin Quickmark

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_for_Instructors/25_GradeMark/QuickMark

Feedback via audiofiles in the Department of English Literature – Professor Cindy Becker


Cindy Becker is the Director of Teaching and Learning for the School of Literature and Languages and also teaches in the Department of English Literature. She is a Senior Fellow of the Higher Education Academy and has been awarded a University of Reading Teaching Fellowship. She is an enthusiastic member of several University Communities of Practice: Placement Tutors, University Teaching Fellows, Technology Enhanced Learning, and Student Engagement Champions.

Cindy is a member of Senate and has sat on university steering committees and working parties; she is also a member of the Management Committee for the School of Literature and Languages and chairs the School Board for Teaching and Learning. She is the convenor of Packaging Literature and Shakespeare on Film.

In September 2015 she started to trial the use of the audio feedback function within Turnitin’s online marking tool (GradeMark). This innovative approach did present some initial challenges but, overall, it proved to be a great success for both Cindy and her students.

OBJECTIVES

GradeMark was introduced to the University in the Summer of 2015. I wanted to use this new marking tool to explore different ways of providing feedback for students. In particular, I wanted to adopt a more personal approach and provide more in-depth feedback without significantly increasing the time I spend marking each essay.

CONTEXT

GradeMark allows you to produce typewritten feedback for assessment work and this is what most of us are used to. However, it will also let you click on an icon that allows you to create an audio file of up to three minutes of spoken feedback instead.

IMPLEMENTATION

I started off by making notes as I marked the essay and then talking through them on the audio file. This did not work very well: my feedback became stilted, took longer than three minutes and was time-consuming to prepare. I think I lacked confidence at the outset.

Now I take a more relaxed approach. I make no more than a couple of notes (and often not even that) and then I simply press the record button. As I talk to the student I scroll down the assignment on the split screen, and this is enough to jog my memory as to what I want to say. Taking a methodical approach has helped me: I always begin with an overview, then work on specific challenges or praiseworthy elements, then end with a brief comment summing up my thoughts. If it goes wrong, I simply scrap the recording and begin again. I save time by setting a finished recording to upload and then beginning work on the next assignment, which saves the frustration of staring at an upload symbol for ages when you want to get on with it.

IMPACT

It is worth the effort.

For now, students love it. I asked students to let me know whether they would prefer written or audio file feedback and those who responded voted for audio file. The novelty factor might wear off, but I think at the moment it is a useful way to engage students in our assessment criteria and module learning aims, in class and beyond.

For now, I love it. It is a pleasant change; it is quicker and fuller than written feedback. It seems to allow me to range more widely and be more personally responsive to students through their assignments. Because I am ‘talking to them’ I have found myself more ready to suggest other modules they might like, or some further reading that they might enjoy.

REFLECTIONS

It can take a few attempts to ensure that your headphones are working within the system. This is usually a problem with GradeMark or Blackboard more generally – restarting Blackboard or even your computer will fix it. You might not have headphones already to hand, and that sounds like another investment of time and money, but it is a good idea to buy cheap headphones – they cost around £20 from a supermarket and are perfectly adequate for the job. You feel like a twit talking to your computer. Of course you do – who wouldn’t? After your first few audio files it will feel perfectly natural.

For the future, I can see it having an impact on assignment tutorials. I believe I can have an equal impact via a tutorial or a three-minute audio file, and everyone actually listens to their audio file. I am going to have to decide what to do with the extra ‘spare’ contact time this might give me…

Changing the assessment experience of professional staff in SAPD – Emily Parsons


Emily Parsons is a Senior Programme Administrator in the School of Agriculture, Policy and Development (SAPD). Online assessment has been adopted throughout the School, impacting academic and non-academic colleagues. In this case study, Emily outlines the experiences of her Support Centre team working with SAPD as an Early Adopter School.

OBJECTIVES

To reduce the administrative burden of assessment and improve the overall assessment experience for staff within the Support Centre whilst supporting change within the School.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. SAPD became an Early Adopter School in May 2017, which allowed the EMA Programme to support a significant shift away from a mixture of online and offline marking to the full provision of online marking where practical. The SAPD Support Centre was involved right from the start, working collaboratively with the EMA Programme, TEL, CQSD and the senior school leadership team during the change process. The Support Centre was one of the first to experience the impact of a shift towards greater online marking on its working practices throughout 2017-2018.

IMPLEMENTATION

As an Early Adopter School, SAPD undertook a full change programme to support online submission, feedback and grading as well as support for all underlying processes. A series of meetings and three major workshops lasting between three and four hours were held throughout the Summer involving all collaborating teams.

Initially only two members of the Support Centre team were involved but representation quickly expanded to include at least four members. It was really important to make sure that a range of professional staff views were being heard during the change planning stage particularly because all of these colleagues would play a role in implementing new processes and delivering change.

Each of these collaborative workshop meetings drew everyone together, in person, in one room instead of relying on e-mail correspondence. This proved far more effective. Relying on e-mail could have significantly delayed the process and may not have led to the kind of in depth, rich discussion around assessment practice, process and policy within the School that was seen at each meeting.

One of the triggers for these debates was the creation of a series of highly detailed process flow diagrams showing the end-to-end assessment process within the University. These process maps outlined who does what and when in four main diagrams – anonymous marking using the Blackboard marking tool, named marking using the Blackboard tool, anonymous marking using Turnitin, and named marking using Turnitin. These maps were essential to understanding the end-to-end process and for allowing the School to start thinking about consistent practices.

Following this approach to consistent practice, professional staff also created a manual containing essential information, such as how to set up submission points or Turnitin similarity reports in the way that the School wanted. All professional staff could then follow this detailed guidance, which proved essential in ensuring that all colleagues were working in a similar way.

IMPACT

Two key areas of impact have been experienced within the Support Centre: the first concerns the adoption of more consistent processes to deal with the submission, receipt, marking and moderation of coursework, and the second concerns the significant increase in the amount of work marked online.

The adoption of more consistent processes was made possible by the creation of the detailed process diagrams outlined above. These show the 45-50 steps involved from submission to final exam board agreement and confirmation, including exactly who does what, and when. The creation of these process diagrams during the Summer workshops, informed by all of the groups involved, was in itself a useful exercise: we could take a step back and really think about how to make the process as efficient and as effective as possible whilst keeping an element of flexibility to cover any type of submission or new requirement that we collectively hadn’t thought of!

During the workshops, the Support Centre, in collaboration with the School, was also asked to create a large assessment spreadsheet listing all submissions due to be submitted during the academic year. The creation of this detailed assessment spreadsheet, in itself, provided an opportunity for colleagues to pause and review the amount of assessment and the School’s use of different assessment types.

This was also a crucial starting point from which we could categorise assessment types (such as group work, individual essay, video submission) and then think through which of the two marking tools – Blackboard or Turnitin – would be most appropriate for each type. The process diagrams, together with these spreadsheets, helped to support workflow and planning within the Support Centres, who then knew exactly what they had to do and when for the full academic year.

Under the new, more consistent processes, academic colleagues were no longer required to create submission points. This role was transferred to professional staff and represented one of the most significant changes undertaken. All submission points are now created in the same way – for example, there is no longer any variation within the School surrounding student views of Turnitin reports, as all students see similarity reports only after the submission deadline. In general, academic colleagues were happy to transfer the set-up of submission points to professional staff and just had to inform the Support Centre in advance when assessment was due. Around 400 pieces of assessment were due during 2017-2018.

Alongside increased consistency surrounding processes, the School has seen significant increases in the amount of work submitted and marked online. Overall this change has improved the assessment experience for colleagues within the Support Centre in a number of ways:

• Previously, using a rota system, colleagues were allocated a time slot to sit in the front office to receive hard copies and process each paper coming in. This was an intense role and so reduced the time available to undertake any other supporting role. There is no need to do this in the current system as submission is managed online for almost all work. This represents a significant time saving for colleagues.

• At the end of the marking process, each paper would also have to be sorted alphabetically and placed in individual envelopes, ready for collection by students. This doesn’t happen now for the vast majority of pieces which are accessed online. In the past this role might have taken half a day. Now it takes an estimated 30 minutes for the small amount of assessment still marked in hard copy. The time saved has been described by professional staff within the team as “extraordinary”.

• This also means that the assessment process has become much more scalable: Support Centres can cope with increases in student numbers without seeing a significant increase in workload.

• The Support Centre used to ask academic colleagues to return marked work to them within 14 working days of submission to allow time for processing. There is no need to do this anymore because the marks and feedback are returned online, so academic colleagues now have the full 15 working days to mark submitted work.

• The Support Centre is no longer drowning in a sea of paper, which frees up much more room and saves storage space. Storage was a particular problem when students failed to come back to collect their work.

• Some of the functions of the marking tools are saving a significant amount of time for the Support Centre. One example concerns non-submission: it used to take a considerable amount of time to contact students who had failed to submit work when they were submitting hard copies. Now Turnitin allows professional staff to send one e-mail to all non-submitters easily and very quickly.

• Previously, in order to undertake internal moderation, Support Centre staff would release marks but hold back the hard-copy coursework, which included the markers’ feedback, until internal moderation had taken place; only after this point would the full feedback be released. In order to undertake external moderation, Part 2 and Part 3 students were asked to create a portfolio of their work, including marks and feedback, and submit this at the end of the academic year so that external examiners could review it. Student engagement in this process was variable, with some students having lost their work by this point. In addition, these processes generated a huge amount of paper and took a large number of working hours to manage. None of this is necessary anymore, aside from a very small amount of fieldtrip work: internal and external moderators can access both marks and feedback quickly and easily online, from wherever they are in the country.

REFLECTIONS

Moving the School towards more consistent approaches to managing assessment and increasing online marking and feedback has largely been a very positive experience for the Support Centre. We are now enjoying a range of benefits which have made our role within the assessment cycle much more manageable.

We had worried that some areas of work might increase – for example, we might have seen more reported cases of academic misconduct as a result of much greater use of Turnitin similarity reports. This has not occurred, but the School had been undertaking a significant amount of work in this area, including the introduction of a formative piece of work at Part 1 and at the start of the MSc programmes, which is then analysed during follow-on seminars.

As we move forward into the next academic year, there are still some areas that we need to think about a little more. We have discovered through this process, for example, that there are multiple ways in which academic colleagues assess and give feedback on presentations. We need to work on understanding the processes in this area more in 2018-2019.

This year we will also be able to start collecting new assessment data and deadlines much earlier, which will enable us to create submission points around July and August and place us in a better position to plan ahead for 2018-2019.

Reflecting on change and the management of non-standard submissions in Typography – Dr Jeanne-Louise Moys


Jeanne-Louise teaches design practice, theory and research skills across a range of genres and platforms. She is the Programme Director for the MA Creative Enterprise and the Pathway Lead for the MA Communication Design (Information Design Pathway).

OBJECTIVES

Typography has been keen to continue to support the move from offline to online submission, feedback and grading, where possible. In particular, the Department has wanted to ensure a more consistent and streamlined approach to managing assessment, especially given the range of diverse submission types within Typography programmes. The Department was also very keen to ensure that online marking tools allowed colleagues to provide feedback that supports students’ design literacy. In this respect, markers aim to give feedback designed to allow for openness in the ways students think and to build students’ confidence to develop their own design judgement.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. In 2015–16, the Department of Typography adopted a policy of either online submission or dual submission (where students are asked to submit both an online digital ‘copy’ and a material form, as relevant to the particular deliverables of different design briefs) across the undergraduate degree. Paper-based feedback forms were replaced with online rubrics. The Department mainly made use of Blackboard as a marking tool, but with some further use of Turnitin, particularly for essay-based assessment. The Department has undertaken this change in the context of growing student numbers, increasing diversity of student cohorts and growing numbers of international students. These trends have increased the need to adopt more efficient and streamlined assessment processes.

IMPLEMENTATION

Over the past four years the Department has supported student online submission and the increased use of marking tools. In 2014, the Head of Department and I initially worked together to explore different online tools to find sustainable assessment practices for increasing cohorts. We liaised with our IT partners, who encouraged us to work with Maria Papaefthimiou, as they were aware that the University was setting up a new TEL team. Maria introduced us to Blackboard rubrics, which we piloted for both practical and written forms of assessment.

These early initiatives were reviewed ahead of our decision to adopt online assessment for all undergraduate coursework (with a few exceptions such as technical tasks, examinations and tasks where self or peer assessment plays a particular role in the learning process). I then translated our paper-based forms into a set of Blackboard rubric templates for colleagues to work with and provided a workshop and video resources to support the transition.

For almost every submitted piece of work, students receive feedback from colleagues using either Turnitin or the Blackboard marking tool. Each piece has an online submission point so that colleagues can provide feedback online, often using the rubrics function within the Blackboard marking tool.

One of the challenges faced by the Department has been managing non-standard types of submission. Typography employs a particularly broad range of assessment types including self- and peer-assessment and group work. It also handles a range of different physical submissions such as books or posters and assessment involving creating designs like websites and app prototypes that exist only in digital form.

Because of the nature of the work, dual submission is common. Our policy of online submission for written work and dual submission for practical work ensures that – regardless of the nature of the work – students receive feedback and grades in a consistent manner throughout their degree.

More recently, we have introduced some new practices that support the development of professional skills and enhance the transparency of group work. For example, professional practice assignments use a project management app, Trello. Students are assessed on their usage and the content (including reflection) they input into the app. The tutor can, for example, set up a Trello group and monitor group activity. Some practical modules require students to use prototyping software or create videos. In these cases, it might be easier for students to share links to this content either by submitting the link itself online to Blackboard or to a dedicated Typography submission e-mail address monitored by administrative colleagues (although this second approach may change as we work with the EMA Team).

A second issue faced by the Department during implementation, as a result of the significant diversity of assessment, is that the management of online submission can become confusing for students in terms of what exactly they should submit and how. The diversity of assessment allows students to demonstrate a range of learning outcomes and broad skills base but the Department has had to ensure that students fully understand the range of submission practices. This challenge exists both in Part 1 when students are being introduced to new practices and in Parts 2 and 3 where a single design brief may have multiple deliverables. We are continually working to find the best balance between ensuring the kind of submission is always appropriate to the learning outcomes, provides students with experience in industry standard software and tools, and is accompanied by clear guidance about submission requirements.

IMPACT

The shift from offline to online assessment within the Department has led to a range of changes to the staff and student experience:

1. Online feedback for students has meant that they now always know where their feedback is. There is no need for them to contact their tutors to access content.

2. For some staff, the use of online marking and feedback has meant spending some time getting used to the interface and learning about the functionality of the tools, particularly the Blackboard marking tool. There have been some issues surrounding the accessibility of rubrics within Blackboard and their consistent use, which the Department has had to work through. In general, colleagues are now reporting that online marking has significantly reduced marking time, especially where more detailed rubrics have been developed and trialled in the current academic year.

3. The Department has spent time thinking carefully about the consistency of the student assessment experience and making the most of the functionality of the tools to make marking easier and, potentially, quicker. As a result, there is a sense that the practices adopted are more sustainable and streamlined, which has been important given rising student numbers and increasingly diverse cohorts.

REFLECTIONS

Over the last year, following recommendations from Periodic Review, the Department has been trialling different practices such as the creation of much more detailed rubrics. As noted above, detailed rubrics seem to reduce marking and feedback time, while providing students with more clarity about the specific criteria used to assess individual projects. However, they do not always accommodate the range of ways in which students can achieve the learning outcomes for creative briefs, or encourage the design literacy and independent judgement we want students to develop. We are also working on ensuring that the terminology used in these rubrics is mapped appropriately to the level of professional skill expected in each part of the degree. The Department is currently looking at the impact of this activity to identify best practice.

Typography is keen to continue to provide the range of assessment options necessary for developing professional skills and industry-relevant portfolios within the discipline. We are committed to complementing this diversity with an assessment and feedback process that gives students a reassuring level of consistency and enables them to evaluate their performance across modules.

There is some scope to develop the marking tools being used. It would, for example, be very helpful if Blackboard could develop a feature whereby students can access their feedback before they can see their marks, or if it allowed colleagues to give a banded mark (such as 60-64), which is appropriate formative feedback in some modules. In addition, Typography students have reported that the user experience could be improved and that the interface could be more intuitive: for example, it could contain fewer layers of information, and access to feedback and marks could be more direct.

More broadly, the shift from offline to online practices has been one driver for the Department to reflect on existing assessment practices. In particular, we have begun to consider how we can better support students’ assessment literacy and have engaged with students to review new practices. Their feedback, in combination with our broader engagement with the new Curriculum Framework and its impact on Programme Level Assessment, is informing the development of a new set of rubric templates to be adopted in autumn 2018.

LINKS

For further information please see the short blog, ‘Curriculum Review in Practice: Aligning to the Curriculum Framework – first steps’, started by Jeanne-Louise Moys, Rob Banham and James Lloyd, at:
http://blogs.reading.ac.uk/engage-in-teaching-and-learning/2018/04/09/curriculum-review-in-practice-aligning-to-the-curriculum-framework-first-steps-started-by-jeanne-louise-moys-rob-banham-james-lloyd/

Pre-sessional English use of Turnitin’s online marking tool – Rob Playfair, IFP Course Tutor

OBJECTIVES

I was interested in improving the efficiency of my marking, and liked the idea of having a digital record of written feedback to students. During the PSE induction for new tutors we were told that the University is moving towards e-feedback over the next few years so it seemed like a useful skill to acquire.

CONTEXT

My group of international students was on a nine-week course to improve their level of English before starting their postgraduate studies. They needed to write three 500-word essays and one 1,500-word project. For each of these, students wrote two drafts, and I needed to provide written feedback on both drafts and the final version of each essay – i.e. a lot of marking!

IMPLEMENTATION

Jonathan Smith, PSE Course Director and ISLI TEL Director, gave all teachers a one-hour workshop on how to use Turnitin and Grademark, during which we had a chance to get hands-on with the software. Each year Jonathan runs a training session for new members of PSE staff who will work on the PSE courses during the summer.

Later, Jonathan shared the PSE ‘QuickMarks’ with those of us who had opted to use e-feedback. We could download these, via our QuickMarks library, into our own personal QuickMarks set. These comments were then available each time we opened an essay.

The QuickMarks focussed on common student errors, with explanations and links to relevant sources. They are based not only on common grammar and lexical errors but also on the complexity of the structures used and on coherence and cohesion in the texts.

Students grew accustomed to submitting work, accessing feedback and seeing their progress.

IMPACT

• It was quicker to note common student errors in-text using the QuickMarks than repeatedly handwriting the same comments.

• Students were able to read and start acting on my feedback as soon as I had marked their work, rather than waiting until the next class.

• I could quickly refer to previous drafts and the comments I had given to monitor uptake.

• I could browse work from students who were not in my class, via the Turnitin feedback suite, to see a broader range of essays and also see the feedback that colleagues were giving because, in this case, the point of submission was the same for the whole cohort. As this was my first experience teaching the programme, this was particularly useful.

REFLECTIONS

The speed of communication with students was the biggest benefit – as soon as my marking was done students could see it. This meant that students could formulate questions about my feedback before class, making the time in class much more productive.

In terms of quality of marking, I think there might be a tendency to over-mark using the QuickMarks: it only takes a second to add one, yet each QuickMark creates quite a lot of work for the student – reading an explanation and perhaps visiting a website. I’d like to explore the impact of this on uptake.

Finally, on a practical level I found this helped my organisation – all the scripts, scores and comments are in one place. It was also easier to submit scripts for moderation: I just gave the names of students to the Course Directors who could go into the system and see the scripts themselves.

FOLLOW UP

• I’m currently using it in a similar way on the International Foundation Programme (IFP).

• At present all students can do is upload their work then download my comments. I’d be interested in a function which allows students to respond to my comments – making corrections or asking questions. This would support the feedback cycle.

• To improve the reliability of the summative scores, I wonder whether we can learn from elements of comparative judgment programmes such as No More Marking.

LINKS

www.nomoremarking.com

http://www.reading.ac.uk/internal/ema/ema-news.aspx

https://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Rethinking assessment design, to improve the student/staff experience when dealing with video submissions

Rachel Warner, School of Arts and Communication Design

Rachel.Warner@pgr.reading.ac.uk

Jacqueline Fairbairn, Centre for Quality Support and Development

j.fairbairn@reading.ac.uk

Overview

Rachel in Typography and Graphic Communication (T&GC) worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow, to improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.

This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.

Objectives

  • Improve staff and student experiences when dealing with video assignment submissions. Specifically, streamlining workflows by improving student assessment literacy and making use of university OneDrive accounts.
  • Support students to develop professional skills for the future, through assessment design (developing digital literacies and communication skills).
  • Provide an authentic assessment experience, in which students self-select technologies (choosing software and a task to demonstrate) to answer a brief.

Context

The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.

Rachel wanted to review the process for submitting video work for e-assessment, and to find ways to streamline the time-intensive marking process – particularly accessing and reviewing video files – without compromising good assessment practice. This is also acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark, so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”

The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving the assessment literacy of students was also a key driver, to reduce the number of queries and technical issues when working with video technologies.

Implementation

Rachel consulted TEL, to discuss:

  • balancing the pedagogic implications of altering the assignment
  • technical implications, such as submission to Blackboard and storage of video

To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Use of OneDrive encouraged professional behaviours, such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised, industry-standard platform.
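
As an illustration, a systematic file name might follow a pattern along these lines (the convention shown is a hypothetical example rather than the one prescribed in the module):

Surname_Firstname_ModuleCode_Screencast_v2.mp4

Encoding student, module and version in a predictable order makes submissions easy to sort, archive and retrieve at marking time.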

To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.

Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.

The most effective intervention was the introduction of a formative support session (1.5 hours). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop supported students to understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, toolkit and digital coversheet, and allowed students to explore the technologies in a supported manner (improving students’ assessment literacy!).

The assignment instructions were strategically revised, to include information (‘hints and tips’) to support the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).

Students were given the option to self-select recording/editing software to undertake the screencast video. Free-to-use tools that students could explore were recommended; ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.

Impact

Marking the submissions was made easier by the ability to access videos in a consistent format, using a clearly structured submission process (the digital coversheet). The ability to play URL links directly through OneDrive meant Rachel was able to store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting the key video timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).

Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly speeded up the marking workflow and allowed Rachel to identify patterns in students’ work where common feedback statements could be applied, as appropriate.
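
As a sketch of how such a spreadsheet might be laid out (the column headings below are illustrative assumptions rather than Rachel’s actual template):

Student | Criterion 1: demonstration of technique | Criterion 2: clarity of narration | Reusable feedback statement | Individual comments

Reading down each criterion column makes cohort-wide patterns visible at a glance and shows where a common feedback statement genuinely applies.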

The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before, and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.

Reflections

The modifications to the assessment:

  • Reflected professional digital competencies required of the discipline;
  • Allowed students to explore a new technology and way of working in a supported context; and,
  • Built confidence, facilitated assessment literacy, and encouraged reflection.

Future modifications to the screencast submission:

  • Peer review could be implemented, asking students to upload videos to a shared space for formative feedback (such as Facebook or a Blackboard discussion board).
  • The digital coversheet had to be downloaded to access the URL links. In future, students could paste links directly into the submission comment field for easier access when marking.
  • Rachel is developing a self-assessment checklist to help students reflect on the production values of their work. The summative assessment rubric focuses on video content, not production values; however, it would be useful for students to get feedback on professional work skills, for example communication skills and the use of narrative devices that translate across other graphic media.

Toolkit basics:

A thumbnail image of the toolkit document; the full toolkit is available via links on the webpage.

  • Outline task expectations and software options, and give recommendations.
  • Source examples of screencasts from your industry and discuss them with students.
  • Provide hints and tips for creating effective screencasts.
  • Provide submission text. Consider asking students to use the ‘submission comment’ field to paste links to their work, for quick marker access to URLs.
  • Plan a formative workshop session to practise using the software and go through the submission process (time invested here is key!).
  • Create a self-assessment checklist to enhance the production quality of videos and highlight the transferable skills that can be developed by focusing on production quality.
  • Consider creating a shared online space for formative peer feedback (e.g. a Blackboard discussion forum).
  • Consider using a marking spreadsheet to cross-reference feedback and highlight good examples of screencasts that can be used in other teaching.

Links

Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted: ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and to learn by ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.

Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a Photoshop technique that requires multiple elements to achieve results. Its conclusion demonstrates the student’s awareness that the technique is useful in scenarios other than the one demonstrated, encouraging the viewer to continue learning. The student used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.

Screencast example: (YouTube link) This screencast demonstrates a student who is competent in a tool and able to use their own work (from another module on the course) to demonstrate a task; it also includes their research into how the tool can be used for other tasks.

Other screencast activity from the Typography & Graphic Communication department, from the GRASS project: (Blog post) A previous project for Part 1 students that included the use of screencasts to demonstrate students’ achievement of learning outcomes.

Celebrating Student Success Through Staff-Student Publication Projects

Dr Madeleine Davies, Department of English Literature

m.k.davies@reading.ac.uk

Overview

In 2017 I replaced the exam on a Part 3 module I convene (‘Margaret Atwood’) with an online learning journal assessment and I was so impressed with the students’ work that I sought funding to publish selected extracts in a UoR book, Second Sight: The Margaret Atwood Learning Journals. The project has involved collaboration between the Department of English Literature and the Department of Typography & Graphic Communication, and it has confirmed the value of staff-student partnerships, particularly in relation to celebrating student attainment and enhancing graduate employability.

Objectives

  • To showcase the achievements of our Part 3 students before they graduate and to memorialise their hard work, engagement and ingenuity in material form
  • To demonstrate at Open Days and Visit Days the quality of teaching and learning in the Department of English Literature in order to support student recruitment
  • To create a resource for students enrolling on the module in future years
  • To encourage reflection and conversation in my School regarding the value of diversified assessment practice

Context

The ‘Margaret Atwood’ module has always been assessed through an exam and a summative essay, but I was dissatisfied with the work the exam produced (I knew that my students could perform better), so I researched alternative assessment formats. In 2017 I replaced the exam with a Blackboard learning journal because my research suggested that it offered the potential to release students’ creative criticality. I preserved the other half of the assessment model, the formal summative essay, because the module also needed an assessment where polished critical reading would be rewarded. With both assessment elements in place, students would need to demonstrate flexible writing skills and adapt to different writing environments (essential graduate skills). A manifest benefit of journal assessment is that it offers students to whom essay-writing does not come easily an opportunity to demonstrate their true ability and engagement, so the decision to diversify assessment connected with inclusive practice.

Implementation

I decided to publish the students’ writing in a UoR book because I did not want to lose their hard work to a digital black hole: it deserved a wider audience. I sought funding from our Teaching and Learning Deans, who supported the project from the beginning, and I connected with the ‘Real Jobs’ scheme in the Department of Typography & Graphic Communication, where students gain valuable professional experience by managing funded publishing commissions for university staff and external clients. This put me in contact with a highly skilled student typographer with an exceptional eye for design. I asked a member of the ‘Margaret Atwood’ group to help me edit the book because I knew that she wanted to pursue a career in publishing and this project would provide invaluable material for her CV. Together we produced a ‘permissions’ form for students to formally indicate that they were releasing their work to the publication; 27 out of the 36 students enrolled on the Spring Term module responded, and all warmly welcomed the initiative. Contributors were asked to submit Word files containing their entries so as to preserve the confidentiality of their online submissions; this was important because the editors and designers were fellow students. Throughout the Summer Term 2018, the students and I met and planned, designed and edited, and the result is a book of which we are proud. With the sole exception of the Introduction, which I wrote, every element of it, from the cover image to the design to the contents, is the work of our students.

Impact

The impact of the project will be registered in terms of Open Days because Second Sight will help demonstrate the range of staff-student academic and employability activities in DEL. In addition, the project has consolidated connections between DEL and the Department of Typography & Graphic Communication, and we will build on this relationship in the next session.

A further impact, which cannot be evidenced easily, is that it provides a useful resource for our graduates’ job applications and interviews: students entering publishing or journalism, for example, will be able to speak to their participation in the project and to their work in the book. The collection showcases some excellent writing and artwork and DEL graduates can attend interviews with tangible evidence of their achievements and abilities.

Reflections

Producing this book with such talented editors, designers and contributors has been a joy: like the ‘Margaret Atwood’ module itself, Second Sight confirms the pleasures and the rewards of working in partnership with our students.

The project sharpened my own editing skills and created a space to share knowledge about publishing conventions with the students who were assisting me. We all learned a great deal from each other: June Lin, the Typography student designer, gave me and the student editor (Bethany Barnett-Sanders) insights into the techniques of typesetting and page layout. To reciprocate, Bethany and I enhanced June’s knowledge of Margaret Atwood’s work, which she had read but never studied. This pooling of knowledge worked to the benefit of us all.

One of the advantages of the learning journal was that it allowed me a clear view of the inventiveness and ingenuity that students bring to their work, and my sense of appreciation for their skill was further enhanced by working with students on the book. Technically, this was less of a ‘staff-student’ collaboration than it was a mutual education between several people.

The process we followed for acquiring written permission from students to include their work in the book, and for gathering Word files to avoid confidentiality issues, was smooth, quick, and could not have been improved. The only difficulty was finding time to edit seventy-five contributions to the book in an already busy term. Whilst this was not easy, the results of the collaboration have made it well and truly worth it.

Follow up

It is too early to tell whether other DEL colleagues will choose to diversify their own assessments and, if they do, pursue a publishing project similar to the ‘Margaret Atwood’ example. There is, however, a growing need for Open Day materials, and Second Sight joins the Department’s Creative Writing Anthology in demonstrating that academic modules contain within them the potential for publication and collaborative initiative. I will certainly be looking to produce more publications of this nature on my other learning journal modules in the next session; in the meantime, copies of Second Sight will be taken with me to the outreach events I’m attending in July in order to demonstrate our commitment to student engagement, experience and employability here at the University of Reading.

Related entries

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-using-focus-groups-to-diversify-assessment-part-2/

http://blogs.reading.ac.uk/t-and-l-exchange/connecting-with-the-curriculum-framework-in-student-participation-at-academic-conferences/

 

Engaging students in assessment design

Dr Maria Kambouri-Danos, Institute of Education

m.kambouridanos@reading.ac.uk

Year of activity 2016/17

Overview

This entry aims to share the experience of re-designing and evaluating assessment in collaboration with students. It explains the need for developing the new assessment design and then discusses the process of implementing and evaluating its appropriateness. It finally reflects on the impact of MCQ tests when assessing students in higher education (HE), and the importance of engaging students as partners in the development of new assessment tools.

Objectives

  • To re-design assessment and remove a high-stakes assessment element.
  • To proactively engage ‘students as partners’ in the development and evaluation of the new assessment tool.
  • To identify the appropriateness of the new design and its impact on both students and staff.

Context

Child Development (ED3FCD) is the core module for the BA in Children’s Development and Learning (BACDL), meaning that a pass grade must be achieved on the first submission to gain a BA Honours degree classification (failing leads to an ordinary degree). The assessment needed to be redesigned because it placed the total weight of students’ marks on a single essay. As programme director, I wanted to engage the students in the re-design process and evaluate the impact of the new design on both students and staff.

Implementation

After attending a session on ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, I decided to explore further the idea of using multiple choice question (MCQ) tests. To do so, I attended a session on ‘Team Based Learning (TBL)’ and another on ‘MCQ: More than just a Test of Information Recall’ to gather targeted knowledge about designing effective MCQ questions.

I realised that MCQ tests can help assess students’ understanding and knowledge and also stimulate active, self-managed learning. Guided by the idea of ‘assessment for learning’, I proposed the use of an MCQ test during a steering group meeting (employees and alumni) and a Board of Studies (BoS) meeting, which second-year Foundation Degree and BACDL student representatives attended. The idea initially met with resistance, as MCQ tests are not traditionally used in education departments in HE. However, after exploring different options and highlighting the advantages of MCQ tests, agreement was unanimous. At the last BoS meeting (2016), students and staff finalised the proposal for the new design, proposing to use the MCQ test for 20% of the overall mark and keeping the essay for the remaining 80%.

At the beginning of 2017, I invited all BACDL students to anonymously post their thoughts and concerns about the new design (and the MCQ test) on Padlet. Based on these comments, I then worked closely with the programme’s student representatives, meeting regularly to discuss, plan and finalise the assessment design. We decided how to calculate the final mark (as the test was completed individually and then in a group), as well as the total number of questions, the duration of the test, etc. A pilot study was then conducted, during which a sample MCQ test was shared with all students, who were asked to practise and then provide feedback. This helped to decide the style of the questions used for the final test, an example of which is given below:

There are now more than one million learners in UK schools who speak English as an additional language (EAL). This represents a considerable proportion of the school population, well above 15 per cent. To help EAL children develop their English, teachers should do all the following, except…

a. use more pictures and photographs to help children make sense of new information.

b. use drama and role play to make learning memorable and encourage empathy.

c. maintain and develop the child’s first language alongside improving their English.

d. get children to work individually because getting them into groups will confuse them and make them feel bad for not understanding.

e. provide opportunities to talk before writing and use drills to help children memorise new language.
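On the mark calculation mentioned above: the case study fixes the overall weighting (20% MCQ test, 80% essay) but does not state how the individual and group sittings of the test were combined, so the 75/25 split in this sketch is purely an assumption for illustration:

```python
# Minimal sketch of one possible mark calculation. The 20%/80% split between
# MCQ test and essay comes from the case study; the 75/25 split between the
# individual and group sittings is an ASSUMPTION made for illustration.

def mcq_mark(individual, group, individual_weight=0.75):
    """Combine individual and group MCQ scores (each out of 100)."""
    return individual_weight * individual + (1 - individual_weight) * group

def module_mark(essay, individual, group):
    """Overall module mark: 80% essay, 20% combined MCQ test."""
    return 0.8 * essay + 0.2 * mcq_mark(individual, group)

print(module_mark(essay=65, individual=70, group=80))  # -> 66.5
```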

Impact

Students were highly engaged in the process of developing the new design, and the staff-student collaboration encouraged the development of bonds within the group. The students were excited by the opportunity to actively develop their own course, and the experience empowered them to take ownership of their own learning. All of them agreed that they felt important and, as a student representative said, that “their voices were heard”.

The new design encouraged students to take the time to gauge what they already knew and to identify their strengths and weaknesses. Students themselves noted that the MCQ test helped them to develop their learning as it was an additional study opportunity. One of them commented that “…writing notes was a good preparation for the exam. The examination was a good learning experience.” Staff also agreed that the test enabled students to (re)evaluate their own performance and enhance their learning. One of the team members noted that the “…test was highly appropriate for the module as it offered an opportunity for students to demonstrate their proficiency against all of the learning outcomes”.

Reflections

The new assessment design was implemented successfully because listening to students’ voices and responding to their feedback was an essential part of the design process. Providing opportunities for both students and staff to offer their views and opinions, and clearly recognising and responding to their needs, were essential, as these measures empowered them and helped them to take ownership of their learning.

The BACDL experience suggests that MCQ tests can be adapted and used for different subject areas as well as to measure a great variety of educational objectives. Their flexibility means that they can be used for different levels of study or learning outcomes, from simple recall of knowledge to more complex levels, such as the student’s ability to analyse phenomena or apply principles to new situations.

However, good MCQ tests take time to develop. It is hoped that next year the process of developing the test will be less time-consuming, as we already have a bank of questions to draw on. This will enable randomisation of questions, which will also help to avoid misconduct. We are also investigating options that would allow the test to be administered online, meaning that feedback could be offered immediately, further reducing the time and effort required to mark the test.
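As a rough sketch of what randomising from a question bank might look like (the bank below is entirely hypothetical, and platforms such as Blackboard offer random question selection natively):

```python
# Hypothetical sketch of drawing a randomised test from a question bank,
# balanced across topics; bank contents and sizes are invented.
import random

TOPICS = ("attachment", "play", "language", "EAL")
QUESTION_BANK = [
    {"id": f"{topic}-{i}", "topic": topic}
    for topic in TOPICS
    for i in range(10)  # assume ten questions per topic
]

def draw_test(n_per_topic=3, seed=None):
    """Draw n questions per topic at random, then shuffle the order."""
    rng = random.Random(seed)
    test = []
    for topic in TOPICS:
        pool = [q for q in QUESTION_BANK if q["topic"] == topic]
        test.extend(rng.sample(pool, n_per_topic))
    rng.shuffle(test)  # a different question order for each student
    return test

print([q["id"] for q in draw_test(seed=1)])
```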

Follow up

MCQ tests are not a panacea; like any other type of assessment tool, they have advantages and limitations. This project has confirmed their adaptability across different subject areas and educational objectives. The evaluation of the assessment design will continue next year, and further feedback will be collected from the cohort and next year’s student representatives.