Making full use of GradeMark in Geography and Environmental Science – Professor Andrew Wade

 

Professor Andrew Wade is responsible for research in hydrology, focused on water pollution, and for undergraduate and postgraduate teaching, including Hydrological Processes.

OBJECTIVES

Colleagues within the School of Archaeology, Geography and Environmental Sciences (SAGES) have been aware of the University’s broader ambition to move towards online submission, feedback and grading where possible. Many had already made the change from paper-based to online practices, and others wanted the opportunity to explore new ways of providing marks and feedback, to see if handling the process online led to a better experience for both staff and students.

CONTEXT

In Summer 2017 it was agreed that SAGES would become one of the Early Adopter Schools working with the EMA Programme. This meant that, from June 2017 onwards, the e-Submission, Feedback and Grading workstream within the Programme worked very closely with both academic and professional colleagues within the School, in order to support all aspects of a change from offline to online marking, and the broader processes, for all coursework except where there was a clear practical reason not to, for example field notebooks.
I had started marking online in 2016-2017 so was familiar with some aspects of marking tools and some of the broader processes.

IMPLEMENTATION

My Part 2 module, GV2HY Hydrological Processes, involves students producing a report containing two sections: Part A focuses on a series of short answers based on practical-class experiences, and Part B requires students to write a short essay. I was keen to use all of the functionality of Grademark/Turnitin during the marking process, so I spent time creating my own personalised QuickMark bank from which I could simply pull across commonly used feedback phrases and marks against each specific question. This function was particularly useful when marking Part A: I could pull across QuickMarks showing the mark and then, in the same comment, explain why the question received, for example, 2 out of a possible 4 marks. It was especially helpful that my School circulated a discipline-specific set of QuickMarks created by a colleague. We could then pull the whole set, or just particular QuickMarks, into our own personalised set if we wanted to. This reduced the time spent on personalising and meant that the quality of my own set was improved further.

I also wanted to explore the usefulness of rubric grids as one way to provide feedback on the essay content in Part B of the assignment. A discipline-specific example rubric grid was created by the School and sent around to colleagues as a starting point. We could then amend this rubric to fit our specific assessment or, more generally, our modules and programmes. The personalised rubrics were attached to assignments using a simple process led by administrative colleagues. When marking, I would indicate the level of performance achieved by each student against each criterion by simply highlighting the relevant box in blue. This rubric grid was used alongside both QuickMarks and in-text comments in the essay. More specific comments were given in the blank free-text box to the right of the screen.

IMPACT

Unfortunately, module evaluation questionnaires were distributed and completed before students received feedback on their assignments, so the student reaction to online feedback using QuickMarks, in-text comments, free-text comments and rubrics was not captured.

In terms of the impact on the marker experience, after spending some initial time getting my personal QuickMarks library right and amending the example rubric to fit my module, I found marking online easier and quicker than marking on paper.

In addition to this, I also found that the use of rubrics helped to ensure standardisation. I felt comfortable that my students were receiving similar amounts of feedback and that this feedback was consistent across the cohort and when returning to marking the coursework after a break. When moderating coursework, I tend to find more consistent marking when colleagues have used a rubric.
I also felt that students received more feedback than they usually might, but am conscious of the risk that they drown in the detail. I try to use the free-text boxes to provide a useful overall summary and to avoid overuse of QuickMarks.

I don’t worry now about carrying large amounts of paper around or securing the work when I take assignments home. I also don’t need to worry about whether the work I’m marking has been submitted after the deadline – under the new processes established in SAGES, Support Centre colleagues deduct marks for late submission.

I do tend to provide my cohorts with a short piece of generic feedback, including an indicator of how the group performed: the percentage of students who attained a mark in each degree class. I could easily access this information from Grademark/Turnitin.
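That class breakdown can also be produced from any export of marks. A minimal sketch in Python, assuming the standard UK undergraduate classification bands (which may differ from the boundaries actually used in a given School):

```python
from collections import Counter

# Illustrative UK classification bands (threshold, label); an assumption,
# not necessarily the boundaries used in SAGES.
BANDS = [
    (70, "First"),
    (60, "2:1"),
    (50, "2:2"),
    (40, "Third"),
    (0, "Fail"),
]

def classify(mark: int) -> str:
    """Return the degree class for a percentage mark."""
    for threshold, label in BANDS:
        if mark >= threshold:
            return label
    return "Fail"

def class_distribution(marks: list[int]) -> dict[str, float]:
    """Percentage of the cohort attaining a mark in each class."""
    counts = Counter(classify(m) for m in marks)
    return {label: round(100 * counts.get(label, 0) / len(marks), 1)
            for _, label in BANDS}
```

For example, `class_distribution([72, 65, 58, 44, 35])` gives 20% of the cohort in each class, which can then be pasted into the generic cohort feedback.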

I’m also still able to work through the feedback received by my Personal Tutees. I arrange individual sessions with them, they access ‘My Grades’ on Blackboard during this meeting and we work through the feedback together.

One issue was that, because of the way the settings had been configured, students could access their feedback as soon as we had finished writing it. This issue was identified quickly and the settings were changed.

REFLECTIONS

My use of online marking has been successful and straightforward, but my experience has been helped very significantly by the availability of two screens in my office. These had already been provided by the School but became absolutely essential. Although I largely mark in my office on campus, when I mark from home I set up two laptops next to each other to replicate having two screens. This set-up allows me to check one part of a student’s work on one screen whilst keeping the rest of their coursework in view on the other.

One further area of note is that the process of actually creating a rubric prompted a degree of reflection on what we actually want to see from students against each criterion and at different levels. This was particularly true around the grade classification boundaries: what is the difference between a high 2:2 and a low 2:1 in terms of each of the criteria we mark against, and how can we describe these differences in the descriptor boxes of a rubric grid so that students can understand them?

This process of trying to make full use of all of the functions within our marking tools has led to some reflection surrounding criteria, what we want to see and how we might describe this to students.

LINKS

For more information on the creation and use of rubrics within Grademark/Turnitin please see the Technology Enhanced Learning Blog pages here:
http://blogs.reading.ac.uk/tel/support-blackboard/blackboard-support-staff-assessment/blackboard-support-staff-turnitin/turnitin-rubrics/

Using Quickmarks to enhance essay feedback in the department of English Literature – Dr Mary Morrissey

Within the department, I teach primarily in Early Modern and Old English. For more details of my teaching please see Mary Morrissey Teaching and Convening

My primary research subject is Reformation literature, particularly from London. I am particularly interested in Paul’s Cross, the most important public pulpit in sixteenth- and seventeenth-century England. I retain an interest in early modern women writers, with a particular focus on women writers’ use of theological arguments. Further details of my research activities can be found at Mary Morrissey Research

OBJECTIVES

A number of modules within the Department of English Literature began using GradeMark as a new marking tool in the Autumn of 2015. I wanted to explore the use of the new QuickMarks function as a way of enhancing the quality of the feedback provided to our students and ensuring the ‘feedback loop’ from general advice on essay writing to the feedback on particular pieces of assessed work was completed.

CONTEXT

The Department developed extensive guidance on writing skills to support student assessment: this includes advice on structuring an argument as well as guidance on grammar and citations. The guide was housed in departmental handbooks and in the assignments folder in Blackboard, and there was considerable concern that it was underused by students. We knew that the QuickMarks function was being used as part of our online feedback provision, and that it was possible to personalise the comments we were using and to add links to those comments as a way of providing additional explanation to students.

IMPLEMENTATION

In order to allow relevant sections of the essay writing style guide to be accessed via QuickMarks, I copied the document into a Google Doc, divided it into sections using Google Doc bookmarks and assigned each bookmark an individual URL. I then used Bitly.com to shorten the URL assigned to each section to make it more usable, and created a set of QuickMarks that included these links to the Style Guide. In this way, students had direct access to the relevant section of the Guide while reading their feedback. So if a student hadn’t adopted the correct referencing format (the Modern Humanities Research Association style in the case of English Literature), the marker would pull a QuickMark across to the relevant point of the essay. When the student hovered over this comment bubble, they would see the text within it but were also able to click on the URL, taking them directly to page 7 of the departmental writing style guide on MHRA citation and referencing. When other colleagues wanted to adopt the same approach, I simply exported the QuickMark set to them, and they incorporated it into their own QuickMarks bank within seconds.
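The linking step described above can be sketched in code. A Google Docs bookmark is addressed with a `#bookmark=` URL fragment; the document ID and bookmark IDs below are hypothetical placeholders (real ones come from the Doc’s own URL and from Insert > Bookmark), and each generated link would then be shortened with Bitly before being pasted into a QuickMark:

```python
# Hypothetical document ID of the Google Doc copy of the style guide.
DOC_ID = "EXAMPLE_DOC_ID"

# Map each style-guide section to its (hypothetical) bookmark ID.
SECTIONS = {
    "MHRA citation and referencing": "bm.id_citation",
    "Structuring an argument": "bm.id_structure",
}

def bookmark_url(doc_id: str, bookmark_id: str) -> str:
    """Build a direct link to one bookmarked section of a Google Doc."""
    return (f"https://docs.google.com/document/d/{doc_id}"
            f"/edit#bookmark={bookmark_id}")

# One long URL per section, ready to be shortened and put in a QuickMark.
links = {name: bookmark_url(DOC_ID, bm) for name, bm in SECTIONS.items()}
```

This is only an illustration of the URL construction; in practice the bookmarks were created and their links copied by hand within Google Docs.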

IMPACT

The Bitly.com tool, used to shorten the URL link, monitored the usage of each link included in our QuickMarks. This showed us how many times and on which date each individual link was used.

To complement this data I also ran a survey on the student response to online marking and feedback, to which 35 undergraduate students responded. This showed that students found feedback most useful when it came in forms that were familiar from paper marking, like general comments on the essay and marginal comments throughout it. Less familiar types of feedback (such as links to web resources included in bubble comments, accessed by hovering the cursor) were often missed. In the survey, 28 out of 35 students said that they did not receive any links to the writing style guide within their QuickMark comments, even though more than this number did in fact receive them. 3 students did not click on the links. Of the 5 remaining students who did make use of the links, 3 responded positively, mentioning their value in terms of improving their writing skills:

“It was good to refer to alongside my work”
“They helped me to strengthen my writing overall”
“Yes motivational to actually look at them, whereas on a paper copy you might read the comment and forget, but here you can click straight through so much easier!”

REFLECTIONS

Some of the new functions available to us in GradeMark allow us to improve our feedback. We shouldn’t just be using online marking tools to replicate existing offline marking processes. We can go much further! But if this is going to be successful, it is really important to inform students about the range of options that online marking makes available, so that they make the most of the systems we use.

Once we do this effectively, we can then explore other options. In English Literature, we are keen to ensure that our Department style guide is used effectively. But there are many other web resources to which we could link through Quickmarks: screencast essay writing guides in Politics and IWLP, as well as the new Academic Integrity toolkit by Study Advice, for example.

By including links within QuickMark comments we help to move students towards greater levels of assessment literacy.

LINKS

Academic Integrity Toolkit

http://libguides.reading.ac.uk/academicintegrity

Examples of assessment support screencasts created by colleagues

Screencast bank

Study Support Screencast Suite

https://www.reading.ac.uk/library/study-advice/guides/lib-sa-videos.aspx

Bitly URL shortener and link management platform

https://bitly.com/

Electronic feedback and grading methods – Dr Geoff Taggart

Dr Geoff Taggart is a lecturer in the Institute of Education and Programme Director for the Early Years Practice programme at Reading. As part of his secondment to the EMA programme, Geoff decided to run a focus group with students from the IoE to gather perspectives on electronic feedback and grading methods.

OBJECTIVES

To identify student views on:

• The perceived benefits of the three most commonly used forms of feedback offered by Grademark (i.e. Quickmarks, rubrics and text comments)

• Preferences regarding the emphasis which each form of feedback should be given in a typical piece of work

• Views regarding the interrelationship of the different forms of feedback

CONTEXT

The focus group was composed of 4 MA students (2 international and 2 home), plus one Chinese academic visitor with recent experience of being a student. Their views were therefore representative of students engaged in social science disciplines and may not be transferable to other fields. Also in attendance were myself, Dr Maria Kambouri (engagement in feedback project) and Jack Lambert-Taylor (EMA). The session took place at the London Road campus between 5.00 and 6.30pm on Thursday 18th January.

IMPLEMENTATION

I provided participants with three copies of the same assignment, one marked exclusively with Quickmarks, one marked only with the final text comment and one marked solely according to the rubric. The purpose of this was to isolate and focus attention upon each of the three kinds of electronic feedback provided through the Feedback Studio.

The marking was not meant to be typical (nor to serve as an example of best practice) but to highlight the positive and negative qualities of each kind of feedback. For example, there were many more quickmark comments appended to the assignment than would usually occur. The purpose of this was to emphasise both the positive benefits of maximised contextualised feedback and the negative impression of ‘overload’ which the comments could give. Additionally, the text comments amounted to over 2,500 words and were extremely conversational and wide-ranging.

Similarly, whilst this strategy deliberately emphasised the dialogical and personal nature of this feedback method, it was not easy to pick out straightforwardly the points where the student needed to improve. By contrast, the rubric does this very clearly but is not a personal way of providing feedback.

REFLECTIONS

Quickmark feedback

• Students appreciated Quickmarks which contained hyperlinks (e.g. to Study Advice)

• One participant noted that they didn’t like the Quickmarks, on the basis that the printed document does not have interactive links. The same participant suggested that excessive Quickmarks may be intrusive, giving the impression of ‘massacring’ a student’s work; they agreed that more sparing use would be preferable. They also noted that there was ‘no positive’ or ‘constructive’ feedback on the page, only problem points. This may be due to the nature of the sample work, which was deliberately of a poor standard; perhaps the same study should be conducted with a high-quality piece of work.

• Another participant noted that narrative summaries can come across as more personal, particularly if negative, and that they preferred Quickmarks on the basis that they provided a more objective tone. Another participant suggested that Quickmarks may come across as more ‘humane’ on that basis, rather than a ‘rant at the end’.

• Another participant suggested that Quickmarks provide good evidence of the thoroughness of the marking process.

• One participant suggested that a Quickmark could indicate which assessment criterion in the rubric it refers to. The facility to do this was explained.

• It was noted that Quickmarks should be phrased impersonally rather than directed at the author, which can appear accusatory: for example, ‘The point is not clear here’ as opposed to ‘You have not been clear here’.

Summary – Quickmarks should be limited in their use, include positive as well as negative comments, include relevant hyperlinks, be focussed on the assignment rather than the student, and be associated with rubric criteria where possible.

Text comments

• Two participants suggested that narrative summary can provide more detailed feedback and valued the conversational tone. It was also suggested that Quickmarks may be perceived as momentary thoughts without reflection, whilst narrative summary may come later after further thought.

• One participant noted that when you write an essay you aren’t ‘just trying to tick boxes in a rubric, you are trying to say something’. This was a really interesting point which emphasised the student expectation of a personal, dialogical relationship with their tutor (something which rich text comments support).

• Several participants noted that marking with more narrative summary would be more time-consuming, and expressed empathy for academics doing so.

• It was also noted that narrative summary would be better-fitted to a conversation in person, and that subtleties within the feedback would be better expressed through intonation in the voice and facial expressions of the marker. Absent those features, it can come across as very serious, and lacks intricacy.

• Students commented that this kind of feedback can also become too ‘waffly’ and lack focus.

Summary – This kind of feedback gives the strongest impression that the tutor has considered the assignment overall, mulled it over and arrived at a holistic impression, something that was highly valued (contrast with: ‘a marked rubric alone shows that the tutor perhaps didn’t think about it that much’). However, the writing needs to be clearly focussed on specific ways in which the student can improve (i.e. bullet points).

Rubric

• Students commented positively that the rubric showed very clearly how successful an assignment had been in general terms. However, they were concerned that it does not explain how to improve if you have not done very well.

• Students questioned how the final mark is actually calculated through the use of a qualitative rubric where the different elements are unweighted – this was considered to lack full transparency.

• It was unanimously agreed that a rubric without comments was not a preferable form of feedback on its own due to lacking feed-forward information, despite the fact that the adjacent rubric statements (i.e. in the next grade band up) also appear to students in the feedback.

• Students did not like the way in which the rubric statements were presented as a consecutive list when printed off. They much preferred the grid they were used to (i.e. with grade boundaries as the columns and rubric criteria as the rows).

Summary – a rubric is useful in showing how successful an assignment has been in a broad and general sense. It could be made more useful by tailoring the rubric to the particular assignment (which would mean multiple rubrics across programmes and the School).

CONCLUSIONS

1. All forms of feedback, taken together, were considered to be useful.

2. The three different forms of feedback need to support each other (e.g. the rubric needs to reflect the written comments, and tutors could use the same language in their text comments as that used in the rubric statements).

3. No matter the means by which feedback is given, students want to feel as though their work has made an impression on their tutor.

4. If tutors want to mark mostly through Quickmarks and rubrics (and provide greatly reduced written comments), this may be perceived negatively by students who expect a more personalised response.

FOLLOW UP

The following points may require consultation from Blackboard:

• One participant suggested that different colours may be used to indicate whether quickmark feedback is positive or negative.

• A tutor suggested that it would be helpful if tutors had flexibility over where to position Quickmarks within their set; otherwise they appear rather randomly, which is an issue when marking at speed.

• All participants said that they liked the use of ticks in marking, but no alternative was suggested. Can a tick symbol be included in the Quickmark set?

• Tutors are able to expand the rubric when marking. Can it be presented to students in this format?

LINKS

Quickmarks:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/QuickMark

Rubrics:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/Rubrics_and_Grading_Forms

Text comments:

https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Feedback_Studio/Commenting_Tools/Text_summary_comments

Using quickmarks and rubrics in online assessment – Catherine Foley

Catherine Foley is a lecturer in Primary Maths Education in the Institute of Education. She is Director of the Primary School Direct programme which trains people to be teachers whilst they are working in schools.

OBJECTIVES

Catherine describes her experience of moving an assignment from Word-based marking to full use of Grademark in the Feedback Studio.

CONTEXT

Her experience of electronic marking relates primarily to a 20 credit postgraduate module which is part of this programme, developing the reflective practice and critical thinking of the trainees. The module is assessed through one piece of written work which is assessed formatively and summatively and is taken by approximately 80 students each year.

IMPLEMENTATION

Up until the current academic year, although students would submit their work through Turnitin (for both formative and summative attempts), they would receive feedback in the form of underlined grading sheets and text-based comments which would be completed for each student and uploaded to be released to them via Grade Centre. As with other IoE programmes, all submission, grading and feedback for this assessment is now carried out electronically.

This year, we decided to use the full electronic feedback option for both assessments, since the first, formative, experience would give students (and staff) the chance to get used to the system. We developed our own rubric for the assessment. For the formative assessment, we decided not to use quickmarks but simply to focus on becoming familiar with the rubric. For the summative assessment, both a rubric and quickmarks were used: the quickmark set is the same as that used for other initial teacher training programmes.

In my own marking, I found it helpful, when getting started, to open out the full rubric in a grid from the sidebar in the feedback studio. After a while, I was clear what the different statements meant and so could use the sliders more confidently.

IMPACT

• Speed of marking. Although marking has not been any quicker so far overall, it is likely to speed up as the administrative problems are ironed out and we get to know the system. Not having to save individual files saves a lot of time, which can be spent on quality feedback.
• Ease of moderation. Because all the assessment and feedback is in the same place, it is much more straightforward, and a module convenor is easily able to quality-assure the marking that is taking place.
• Curriculum review opportunity. Developing our own rubric for the assessment encouraged us to review what we had been doing. It made us stop and examine our taken-for-granted practice.
• Student ownership of feedback. We had a workshop on developing academic writing, and it was interesting to see all the students with their laptops open, looking at very specific pieces of contextualised feedback received online for their first assignment.
• Using rubric reports for bespoke study advice sessions. We used the function in Turnitin to generate a report on how well students had achieved as a cohort in relation to the different rubric themes. We sent the report to one of the study advisers, who was then able to use it to pinpoint areas to focus upon in helping students work towards their next assignment.

REFLECTIONS

Many of the challenges we experienced were due to the fact that the assessment is marked by five different members of staff:

  • When we were using Word-based documents for feedback, we could shape and guide the feedback which tutors were giving more easily (for example with a writing frame). In the feedback studio, the text comment box presents markers with a blank space so it has been harder to ensure a common approach across markers. We therefore agreed a common structure for feedback in this box.
  • The marking team had differing levels of experience with electronic marking. Because the quickmark set had to be uploaded by each marker to their Blackboard account and not all markers were present on campus at the same time, this was a logistical challenge.
• With the options for quickmarks, rubric statements and open text comments, it would be easy for markers to over-assess each piece of work. Our agreement was that, since students were getting extra feedback through the first two kinds of feedback, the final text comments should be brief, simply recognising specific areas of success and then pinpointing areas for development.
• Limitations in the functionality of the feedback studio. Some markers liked to be able to use Word to check the number of times a student had used a particular phrase, or to look at the consistency between citations and references: you can’t currently move around the document so easily (unless you download it). Some warning or confirmation messages from the system (for example when moving on to the next piece of work) would make it still more user-friendly. With several people involved in marking an assignment, it is easy for markers to accidentally change each other’s grades; it would be helpful if grades and comments could be ‘locked’ in some way. Are different levels of access possible, so that external examiners can see the feedback studio without being able to change feedback?
• There are still issues (mostly to do with administrative protocols) to iron out. The IoE is currently reviewing its moderation processes and determining the extent to which students know they have been included. Programme directors are working with their admin teams to determine exactly how academics will be informed when an ECF assignment has been submitted.