Image of Catherine Foley

OBJECTIVES

Catherine describes her experience of using the Feedback Studio to move from Word-based marking of an assignment to full use of Grademark.

CONTEXT

Catherine Foley is a lecturer in Primary Maths Education in the Institute of Education. She is Director of the Primary School Direct programme, which trains people to be teachers whilst they are working in schools. Her experience of electronic marking relates primarily to a 20-credit postgraduate module which is part of this programme, developing the reflective practice and critical thinking of the trainees. The module is assessed through a single piece of written work, marked both formatively and summatively, and is taken by approximately 80 students each year.

IMPLEMENTATION

Until the current academic year, although students would submit their work through Turnitin (for both formative and summative attempts), they received feedback in the form of underlined grading sheets and text-based comments, which were completed for each student and uploaded for release via Grade Centre. As with other IoE programmes, all submission, grading and feedback for this assessment is now carried out electronically.

This year, we decided to use the full electronic feedback option for both assessments, since the first formative experience would give students (and staff) the chance to get used to the system. We developed our own rubric for the assessment. For the formative assessment, we decided not to use quickmarks but just to focus on becoming familiar with using the rubric. For the summative assessment, both a rubric and quickmarks were used: the quickmark set is the same as that used for other initial teacher training programmes.

In my own marking, I found it helpful, when getting started, to open out the full rubric in a grid from the sidebar in the Feedback Studio. After a while, I was clear about what the different statements meant and so could use the sliders more confidently.

IMPACT

  • Speed of marking. Although marking has not been quicker overall so far, it is likely to speed up as the administrative problems are ironed out and we get to know the
    system. Not having to save individual files saves a lot of time, which can instead be spent on quality feedback.
  • Ease of moderation. Because all the assessment and feedback is in one place, moderation is much more straightforward, and a module convenor can easily quality-assure the marking
    that is taking place.
  • Curriculum review opportunity. Developing our own rubric for the assessment encouraged us to review what we had been doing. It made us stop and examine our taken-for-granted practice.
  • Student ownership of feedback. We had a workshop on developing academic writing and it was interesting to see all the students with their laptops open, looking at very specific
    pieces of contextualised feedback received online for their first assignment.
  • Using rubric reports for bespoke study advice sessions. We used the function in Turnitin to generate a report on how the cohort had performed in relation to the different
    rubric themes. We sent the report to one of the study advisers, who was then able to use it to pinpoint areas to focus on when helping students work towards their next assignment.

REFLECTIONS

Many of the challenges we experienced arose because the assessment is marked by five different members of staff:

  • When we were using Word-based documents for feedback, we could more easily shape and guide the feedback tutors were giving (for example, with a writing frame). In the Feedback Studio, the text comment box presents markers with a blank space, so it has been harder to ensure a common approach across markers; we therefore agreed a common structure for feedback in this box.
  • The marking team had differing levels of experience with electronic marking. Because the quickmark set had to be uploaded by each marker to their own Blackboard account, and not all markers were present on campus at the same time, this presented a logistical challenge.
  • With the options for quickmarks, rubric statements and open text comments, it would be easy for markers to over-assess each piece of work. We agreed that, since students were already receiving feedback through the quickmarks and the rubric, the final text comments should be brief, simply recognising specific areas of success and then pinpointing areas for
    development.
  • Limitations in the functionality of the Feedback Studio. Some markers liked being able to use Word to check the number of times a student had used a particular phrase, or to look at the
    consistency between citations and references: you cannot currently move around the document so easily (unless you download it). Some warning or confirmation messages from
    the system (for example, when moving on to the next piece of work) would make it still more user-friendly. With several people involved in marking an assignment, it is easy for markers
    to accidentally change each other's grades; it would be helpful if grades and comments could be 'locked' in some way. Are different levels of access possible, so that external examiners could view the Feedback Studio without being able to change feedback?
  • There are still issues (mostly to do with administrative protocols) to iron out. The IoE is currently reviewing its moderation processes and determining the extent to which
    students know they have been included. Programme directors are working with their admin teams to determine exactly how
    academics will be informed when an ECF assignment has been submitted.