Feedback via audio files in the Department of English Literature – Professor Cindy Becker


Cindy Becker is the Director of Teaching and Learning for the School of Literature and Languages and also teaches in the Department of English Literature. She is a Senior Fellow of the Higher Education Academy and has been awarded a University of Reading Teaching Fellowship. She is an enthusiastic member of several University Communities of Practice: Placement Tutors, University Teaching Fellows, Technology Enhanced Learning, and Student Engagement Champions.

Cindy is a member of Senate and has sat on university steering committees and working parties; she is also a member of the Management Committee for the School of Literature and Languages and chairs the School Board for Teaching and Learning. She is the convenor of the modules Packaging Literature and Shakespeare on Film.

In September 2015 she started to trial the use of the audio feedback function within Turnitin’s online marking tool (GradeMark). This innovative approach did present some initial challenges but, overall, it proved to be a great success for both Cindy and her students.

OBJECTIVES

GradeMark was introduced to the University in the Summer of 2015. I wanted to use this new marking tool to explore different ways of providing feedback for students. In particular, I wanted to adopt a more personal approach and provide more in-depth feedback without significantly increasing the time I spend marking each essay.

CONTEXT

GradeMark allows you to produce typewritten feedback on assessed work, which is what most of us are used to. However, it also lets you click an icon and record an audio file of up to three minutes of spoken feedback instead.

IMPLEMENTATION

I started off by making notes as I marked each essay and then talking through them on the audio file. This did not work very well: my feedback became stilted, ran over three minutes and was time-consuming to prepare. I think I lacked confidence at the outset.

Now I take a more relaxed approach. I make no more than a couple of notes (and often not even that) and then I simply press the record button. As I talk to the student I scroll down the assignment on the split screen, and this is enough to jog my memory as to what I want to say. Taking a methodical approach has helped me. I always begin with an overview, then work on specific challenges or praiseworthy elements, then end with a brief comment summing up my thoughts. If it goes wrong, I simply scrap the recording and begin again. I save time by setting a recording to upload and then beginning work on the next assignment. This saves the frustration of staring at an upload symbol for ages when you want to get on with it.

IMPACT

It is worth the effort.

For now, students love it. I asked students to let me know whether they would prefer written or audio feedback, and those who responded voted for audio. The novelty factor might wear off, but at the moment I think it is a useful way to engage students in our assessment criteria and module learning aims, in class and beyond.

For now, I love it. It is a pleasant change; it is quicker and fuller than written feedback. It seems to allow me to range more widely and be more personally responsive to students through their assignments. Because I am ‘talking to them’ I have found myself more ready to suggest other modules they might like, or some further reading that they might enjoy.

REFLECTIONS

It can take a few attempts to ensure that your headphones are working within the system. This is most usually a problem with GradeMark or Blackboard more generally – restarting Blackboard or even your computer will fix it. You might not have headphones already to hand, and that sounds like another investment of time and money, but it is a good idea to buy cheap headphones – they cost around £20 from a supermarket and are perfectly adequate for the job. You feel like a twit talking to your computer. Of course you do – who wouldn’t? After your first few audio files it will feel perfectly natural.

For the future, I can see it having an impact on assignment tutorials. I believe I can have an equal impact via a tutorial or a three-minute audio file, and everyone actually listens to their audio file. I am going to have to decide what to do with the extra ‘spare’ contact time this might give me…

Changing the assessment experience of professional staff in SAPD – Emily Parsons


Emily Parsons is a Senior Programme Administrator in the School of Agriculture, Policy and Development (SAPD). Online assessment has been adopted throughout SAPD, impacting academic and non-academic colleagues. In this case study, Emily outlines the experiences of her Support Centre team working with SAPD as an Early Adopter School.

OBJECTIVES

To reduce the administrative burden of assessment and improve the overall assessment experience for staff within the Support Centre whilst supporting change within the School.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. SAPD became an Early Adopter School in May 2017, which allowed the EMA Programme to support a significant shift away from a mixture of online and offline marking to the full provision of online marking where practical. The SAPD Support Centre was involved right from the start, working collaboratively with the EMA Programme, TEL, CQSD and the School’s senior leadership team during the change process. The Support Centre was one of the first to experience the impact of a shift towards greater online marking on its working practices throughout 2017-18.

IMPLEMENTATION

As an Early Adopter School, SAPD undertook a full change programme to support online submission, feedback and grading as well as support for all underlying processes. A series of meetings and three major workshops lasting between three and four hours were held throughout the Summer involving all collaborating teams.

Initially only two members of the Support Centre team were involved but representation quickly expanded to include at least four members. It was really important to make sure that a range of professional staff views were being heard during the change planning stage particularly because all of these colleagues would play a role in implementing new processes and delivering change.

Each of these collaborative workshop meetings drew everyone together, in person, in one room instead of relying on e-mail correspondence. This proved far more effective. Relying on e-mail could have significantly delayed the process and may not have led to the kind of in-depth, rich discussion around assessment practice, process and policy within the School that was seen at each meeting.

One of the triggers for these debates was the creation of a series of highly detailed process flow diagrams showing the end-to-end assessment process within the University. These process maps outlined who does what, and when, in four main diagrams – anonymous marking using the Blackboard marking tool, named marking using the Blackboard tool, anonymous marking using Turnitin, and named marking using Turnitin. These maps were essential to understanding the end-to-end process and allowed the School to start thinking about consistent practices.

Following this approach to consistent practice, professional staff also created a manual containing essential information, such as how to set up submission points or Turnitin similarity reports in the way that the School wanted. All professional staff could then follow this detailed guidance, which proved essential to ensuring that all colleagues were working in a similar way.

IMPACT

Two key areas of impact have been experienced within the Support Centre: the first surrounds the adoption of more consistent processes to deal with the submission, receipt, marking and moderation of coursework, and the second surrounds the significant increase in the amount of work marked online.

The adoption of more consistent processes was made possible by the creation of the detailed process diagrams outlined above. These show the 45-50 steps involved from submission to final exam board agreement and confirmation, including exactly who does what, and when. The creation of these process diagrams during the Summer workshops, informed by all of the groups involved, was in itself a useful exercise. We could take a step back and really think about how we could make this process as efficient and as effective as possible whilst keeping an element of flexibility to cover any type of submission or new requirement that we collectively hadn’t thought of!

During the workshops, the Support Centre, in collaboration with the School, was also asked to create a large assessment spreadsheet listing all assessments due to be submitted during the academic year. The creation of this detailed spreadsheet, in itself, provided an opportunity for colleagues to pause and review the amount of assessment and the School’s use of different assessment types.

This was also a crucial starting point from which we could categorise assessment types (such as group work, individual essay, video submission) and then think through which of the two marking tools – Blackboard or Turnitin – would be most appropriate for each type. The process diagrams, together with these spreadsheets, helped to support workflow and planning within the Support Centres, which then knew exactly what they had to do, and when, for the full academic year.
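The tool decision can be thought of as a simple lookup from assessment category to marking tool. The sketch below is purely illustrative – written in Python, with invented categories and pairings rather than SAPD’s actual decisions – of the kind of consistent rule such a spreadsheet lets a Support Centre record and apply.

```python
# Hypothetical mapping of assessment type to marking tool.
# The pairings are illustrative assumptions, not SAPD's actual choices.
MARKING_TOOL_BY_TYPE = {
    "individual essay": "Turnitin",    # text-based work suits similarity reports
    "group work": "Blackboard",        # Blackboard handles group submissions
    "video submission": "Blackboard",  # non-text file types
}

def tool_for(assessment_type: str) -> str:
    """Return the agreed marking tool, or flag the type for discussion."""
    return MARKING_TOOL_BY_TYPE.get(assessment_type, "to be agreed")

print(tool_for("individual essay"))  # Turnitin
print(tool_for("poster"))            # to be agreed
```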

Under the new, more consistent processes, academic colleagues were no longer required to create submission points. This role was transferred to professional staff and represented one of the most significant changes undertaken. All submission points are now created in the same way – for example, there is no longer any variation within the School surrounding student views of Turnitin reports, as all students only see similarity reports after the submission deadline. In general, academic colleagues were happy to transfer the set-up of submission points to professional staff and just had to inform the Support Centre, in advance, when assessment was due. Around 400 pieces of assessment were due during 2017-2018.

Alongside increased consistency surrounding processes, the School has seen significant increases in the amount of work submitted and marked online. Overall this change has improved the assessment experience for colleagues within the Support Centre in a number of ways:

• Previously, using a rota system, colleagues were allocated a time slot to sit in the front office to receive hard copies and process each paper coming in. This was an intense role and reduced the time available to undertake other supporting work. There is no need to do this in the current system, as submission is managed online for almost all work. This represents a significant time saving for colleagues.

• At the end of the marking process, each paper would also have to be sorted alphabetically and placed in individual envelopes, ready for collection by students. This doesn’t happen now for the vast majority of pieces, which are accessed online. In the past this role might have taken half a day; now it takes an estimated 30 minutes for the small amount of assessment still marked in hard copy. The time saved has been described by professional staff within the team as “extraordinary”.

• This also means that the assessment process has become much more scalable. Support Centres can cope with increases in students without seeing significant increase in workload.

• The Support Centre used to ask academic colleagues to return marked work to them within 14 working days of submission to allow time for processing. There is no need to do this anymore because the marks and feedback are returned online, so academic colleagues now have the full 15 working days to mark submitted work.

• The Support Centre is no longer drowning in a sea of paper, leaving much more room and saving storage space. This was a particular problem when students failed to come back to collect their work.

• Some of the functions of the marking tools are saving a significant amount of time for the Support Centre. One example surrounds non-submission. When students were submitting hard copies, it took a considerable amount of time to contact those who had failed to submit work. Now Turnitin allows professional staff to send one e-mail to all non-submitters easily and very quickly.

• Previously, in order to undertake internal moderation, Support Centre staff would release marks but withhold the hard-copy coursework, which included the markers’ feedback, from students until internal moderation had taken place. After this point, the full feedback would also be released. In order to undertake external moderation, Part 2 and Part 3 students were asked to create a portfolio of their work, including marks and feedback, and submit this at the end of the academic year so that external examiners could review the work. Student engagement in this process was variable, with some students having lost their work by this point. In addition, these processes generated a huge amount of paper and took a large number of working hours to manage. This isn’t necessary anymore, aside from a very small amount of fieldtrip work. Internal and external moderators can access both marks and feedback quickly and easily online, from wherever they are in the country.

REFLECTIONS

Moving the School towards more consistent approaches to managing assessment and increasing online marking and feedback has largely been a very positive experience for the Support Centre. We are now enjoying a range of benefits which have made our role within the assessment cycle much more manageable.

We had worried that some areas of work might increase – for example, we might have seen more reported cases of academic misconduct as a result of much greater use of Turnitin similarity reports. This has not occurred, but the School had been undertaking a significant amount of work in this area, including the introduction of a formative piece of work at Part 1 and at the start of the MSc programmes, which is then analysed during follow-on seminars.

As we move forward into the next academic year, there are still some areas that we need to think about a little more. We’ve discovered through this process, for example, that there are multiple different ways in which academic colleagues assess and give feedback on presentations. We need to work on understanding the processes in this area more in 2018-2019.

This year we will also be able to start collecting new assessment data and deadlines much earlier, enabling us to create submission points around July and August and placing us in a better position to plan ahead for 2018-2019.

Reflecting on change and the management of non-standard submissions in Typography – Dr Jeanne-Louise Moys


Jeanne-Louise teaches design practice, theory and research skills across a range of genres and platforms. She is the Programme Director for the MA Creative Enterprise and the Pathway Lead for the MA Communication Design (Information Design Pathway).

OBJECTIVES

Typography has been keen to continue to support the move from offline to online submission, feedback and grading, where possible. In particular, the Department has wanted to ensure a more consistent and streamlined approach to managing assessment, especially given the range of diverse submission types within Typography programmes. The Department was also very keen to ensure that online marking tools allowed colleagues to provide feedback that supports students’ design literacy. In this respect, markers aim to give feedback that allows for openness in the ways students think and that builds students’ confidence to develop their own design judgement.

CONTEXT

The University has a long-term vision to move toward online assessment, where practical, and improve underlying processes. In 2015-16, the Department of Typography adopted a policy of either online submission or dual submission (where students are asked to submit both an online digital ‘copy’ and a version in material form, as relevant to the particular deliverables of different design briefs) across the undergraduate degree. Paper-based feedback forms were replaced with online rubrics. The Department mainly made use of Blackboard as a marking tool, but with some further use of Turnitin, particularly for essay-based assessment. The Department has undertaken this change in the context of growing student numbers, increasing diversity of student cohorts and growing numbers of international students. These trends have increased the need to adopt more efficient and streamlined assessment processes.

IMPLEMENTATION

Over the past four years the Department has supported student online submission and the increased use of marking tools. In 2014, the Head of Department and I initially worked together to explore different online tools to find sustainable assessment practices for increasing cohorts. We liaised with our IT partners, who encouraged us to work with Maria Papaefthimiou, as they were aware that the University was setting up a new TEL team. Maria introduced us to Blackboard rubrics, which we piloted for both practical and written forms of assessment.

These early initiatives were reviewed ahead of our decision to adopt online assessment for all undergraduate coursework (with a few exceptions such as technical tasks, examinations and tasks where self or peer assessment plays a particular role in the learning process). I then translated our paper-based forms into a set of Blackboard rubric templates for colleagues to work with and provided a workshop and video resources to support the transition.

For almost every submitted piece of work, students receive feedback from colleagues using either Turnitin or the Blackboard marking tool. Each piece has an online submission point so that colleagues can provide feedback online, often using the rubrics function within the Blackboard marking tool.

One of the challenges faced by the Department has been managing non-standard types of submission. Typography employs a particularly broad range of assessment types including self- and peer-assessment and group work. It also handles a range of different physical submissions such as books or posters and assessment involving creating designs like websites and app prototypes that exist only in digital form.

Because of the nature of the work, dual submission is common. Our policy of online submission for written work and dual submission for practical work ensures that – regardless of the nature of the work – students receive feedback and grades in a consistent manner throughout their degree.

More recently, we have introduced some new practices that support the development of professional skills and enhance the transparency of group work. For example, professional practice assignments use a project management app, Trello. Students are assessed on their usage and the content (including reflection) they input into the app. The tutor can, for example, set up a Trello group and monitor group activity. Some practical modules require students to use prototyping software or create videos. In these cases, it might be easier for students to share links to this content, either by submitting the link itself to Blackboard or by sending it to a dedicated Typography submission e-mail address monitored by administrative colleagues (although this second approach may change as we work with the EMA Team).
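As an illustration of the kind of monitoring this makes possible, a tutor with access to a group’s board could list its recent activity through Trello’s public REST API. The sketch below is a hypothetical example in Python: the board ID and credentials are placeholders, and it is not a description of the Department’s actual workflow.

```python
# Hedged sketch: listing recent activity on one project group's Trello board
# via Trello's REST API. All identifiers below are placeholders.
import requests

API_KEY = "your-api-key"      # obtained from Trello's developer pages
API_TOKEN = "your-api-token"  # a user-authorised token
BOARD_ID = "abc123"           # hypothetical board for one project group

resp = requests.get(
    f"https://api.trello.com/1/boards/{BOARD_ID}/actions",
    params={"key": API_KEY, "token": API_TOKEN, "limit": 50},
)
resp.raise_for_status()

# Each action records who did what, and when - enough to gauge whether
# every group member is contributing.
for action in resp.json():
    print(action["date"], action["memberCreator"]["fullName"], action["type"])
```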

A second issue faced by the Department during implementation, as a result of the significant diversity of assessment, is that the management of online submission can become confusing for students in terms of what exactly they should submit, and how. The diversity of assessment allows students to demonstrate a range of learning outcomes and a broad skills base, but the Department has had to ensure that students fully understand the range of submission practices. This challenge exists both in Part 1, when students are being introduced to new practices, and in Parts 2 and 3, where a single design brief may have multiple deliverables. We are continually working to find the best balance between ensuring that the kind of submission is always appropriate to the learning outcomes, provides students with experience of industry-standard software and tools, and is accompanied by clear guidance about submission requirements.

IMPACT

The shift from offline to online assessment within the Department has led to a range of changes to the staff and student experience:

1. Online feedback for students has meant that they now always know where their feedback is. There is no need for them to contact their tutors to access content.

2. For some staff, the use of online marking and feedback has meant spending some time getting used to the interface and learning about the functionality of the tools, particularly the Blackboard marking tool. There have been some issues surrounding the accessibility of rubrics within Blackboard and their consistent use, which the Department has had to work through. In general, colleagues are now reporting that online marking has significantly reduced marking time, especially where more detailed rubrics have been developed and trialled in the current academic year.

3. The Department has spent time thinking carefully about the consistency of the student assessment experience and making the most of the functionality of the tools to make marking easier and, potentially, quicker. As a result, there is a sense that the practices adopted are more sustainable and streamlined, which has been important given rising student numbers and increasingly diverse cohorts.

REFLECTIONS

Over the last year, following recommendations from Periodic Review, the Department has been trialling different practices such as the creation of much more detailed rubrics. As noted above, detailed rubrics seem to reduce marking and feedback time, while providing students with more clarity about the specific criteria used to assess individual projects. However, these do not always accommodate the range of ways in which students can achieve the learning outcomes for creative briefs or encourage the design literacy and independent judgment we want students to develop.
We are also working on ensuring that the terminology used in these rubrics is mapped appropriately to the level of professional skill expected in each part of the degree. The Department is currently looking at the impact of this activity to identify best practice.

Typography is keen to continue to provide a range of assessment options necessary for developing professional skills and industry- relevant portfolios within the discipline. We are committed to complementing this diversity with an assessment and feedback process that gives students a reassuring level of consistency and enables them to evaluate their performance across modules.
There is some scope to develop the marking tools being used. It would, for example, be very helpful if Blackboard could develop a feature where students can access their feedback before they can see their marks, or if it allowed colleagues to give a banded mark (such as 60-64), which is appropriate formative feedback in some modules. In addition, Typography students have reported that the user experience could be improved and that the interface could be more intuitive. For example, it could contain fewer layers of information, and access to feedback and marks could be more direct.

More broadly, the shift from offline to online practices has been one driver for the Department to reflect on existing assessment practices. In particular, we have begun to consider how we can better support students’ assessment literacy and have engaged with students to review new practices. Their feedback, in combination with our broader engagement with the new Curriculum Framework and its impact on Programme Level Assessment, is informing the development of a new set of rubric templates to be adopted in autumn 2018.

LINKS

For further information please see the short blog post ‘Curriculum Review in Practice: Aligning to the Curriculum Framework – First Steps’ at:
http://blogs.reading.ac.uk/engage-in-teaching-and-learning/2018/04/09/curriculum-review-in-practice-aligning-to-the-curriculum-framework-first-steps-started-by-jeanne-louise-moys-rob-banham-james-lloyd/

Pre-sessional English use of Turnitin’s online marking tool – Rob Playfair, IFP Course Tutor

OBJECTIVES

I was interested in improving the efficiency of my marking, and liked the idea of having a digital record of written feedback to students. During the PSE induction for new tutors we were told that the University is moving towards e-feedback over the next few years so it seemed like a useful skill to acquire.

CONTEXT

My group of international students were on a nine-week course to improve their level of English before starting their postgraduate studies. They needed to write three 500-word essays and one 1,500-word project. For each of these, students wrote two drafts. I needed to provide written feedback on both drafts and the final version of each essay – i.e. a lot of marking!

IMPLEMENTATION

Jonathan Smith, PSE course director and ISLI TEL Director, gave all teachers a one-hour workshop on how to use Turnitin and GradeMark, during which we had a chance to get hands-on with the software. Each year Jonathan runs a training session for new members of PSE staff who will work on the PSE courses during the summer.

Later, Jonathan shared the PSE ‘QuickMarks’ with those of us who had opted to use e-feedback. We could download these, via our QuickMarks library, into our own personal QuickMarks set. These comments were then available each time we opened an essay.

The QuickMarks focussed on common student errors, with explanations and links to relevant sources. They are based not only on common grammar and lexical errors but also on the complexity of the structures used, and on coherence and cohesion in the texts.

Students grew accustomed to submitting work, accessing feedback and seeing their progress.

IMPACT

• It was quicker to note common student errors in-text using the QuickMarks than repeatedly writing out the same comments by hand.

• Students were able to read and start acting on my feedback as soon as I had completed it, rather than waiting until the next class.

• I could quickly refer to previous drafts and the comments I had given to monitor uptake.

• I could browse work from students who were not in my class, via the Turnitin feedback suite, to see a broader range of essays and also the feedback that colleagues were giving, because, in this case, the point of submission was the same for the whole cohort. As this was my first experience of teaching the programme, this was particularly useful.

REFLECTIONS

The speed of communication with students was the biggest benefit – as soon as my marking was done students could see it. This meant that students could formulate questions about my feedback before class, making the time in class much more productive.

In terms of quality of marking, I think there might be a tendency to over-mark using the QuickMarks, because it only takes a second to add one yet it creates quite a lot for the student to do – reading an explanation and perhaps visiting a website. I’d like to explore the impact of this on uptake.

Finally, on a practical level I found this helped my organisation – all the scripts, scores and comments are in one place. It was also easier to submit scripts for moderation: I just gave the names of students to the Course Directors who could go into the system and see the scripts themselves.

FOLLOW UP

• I’m currently using it in a similar way on the International Foundation Programme (IFP).

• At present all students can do is upload their work then download my comments. I’d be interested in a function which allows students to respond to my comments – making corrections or asking questions. This would support the feedback cycle.

• To improve the reliability of the summative scores, I wonder whether we can learn from elements of comparative judgment programmes such as No More Marking.

LINKS

www.nomoremarking.com

http://www.reading.ac.uk/internal/ema/ema-news.aspx

https://www.reading.ac.uk/ISLI/study-in-the-uk/isli-pre-sessional-english.aspx

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 1)

Dr Madeleine Davies, School of Literature and Languages

Overview

The Department of English Literature (DEL) is organising student focus groups as part of our TLDF-funded ‘Diversifying Assessments’ project led by Dr Chloe Houston and Dr Madeleine Davies. This initiative is in dialogue with Curriculum Framework emphases on engaging students in Programme Development and involving them as stakeholders. This entry outlines the preparatory steps taken to set up our focus groups, the feedback from the first meeting, and our initial responses to it.

Objectives

  • To involve students in developing a more varied suite of assessment methods in DEL.
  • To hear student views on existing assessment patterns and methods.
  • To gather student responses to electronic methods of assessment (including learning journals, blogs, vlogs and wikis).

Context

We wanted to use Curriculum Framework emphases on Programme Review and Development to address assessment practices in DEL. We had pre-identified areas where our current systems might usefully be reviewed and we decided to use student focus groups to provide valuable qualitative data about our practices so that we could make sure that any changes were informed by student consultation.

Implementation

I attended a People Development session ‘Conducting Focus Groups’ to gather targeted knowledge about setting up focus groups and about analytical models of feedback evaluation. I also attended a CQSD event, ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, to gain new ideas about feedback practice.

I applied for and won TLDF mini-project funding to support the Diversifying Assessments project. The TLDF funding enabled us to regard student focus groups as a year-long consultative process, supporting a review of assessment models and feedback practices in DEL.

In Spring Term 2017, I emailed our undergraduate students and attracted 11 students for the first focus group meeting. We aim to include as diverse a range of participants as possible in the three planned focus group meetings in 2016-17. We also aim to draw contributors from all parts of the undergraduate programme.

To prepare the first focus group:

  • I led a DEL staff development session on the Diversifying Assessment project at the School of Literature and Languages’ assessment and feedback away day; this helped me to identify key questions and topics with colleagues.
  • I conducted a quantitative audit of our assessment patterns and I presented this material to the staff session to illustrate the nature of the issues we aim to address. This tabulated demonstration of the situation enabled colleagues to see that the need for assessment and feedback review was undeniable.

At the first focus group meeting, topics and questions were introduced by the two project leaders and our graduate intern, Michael Lyons, took minutes. We were careful not to approach the group with clear answers already in mind: we used visual aids to open conversation (see figures 1 and 2) and to provide the broad base of key debates. We also used open-ended questions to encourage detail and elaboration.

Group discussion revealed a range of issues and opinions that we would not have been able to anticipate had we not held the focus group:

  • Students said that a module’s assessment pattern was the key determinant in their selection of modules.
  • Some students reported that they seek to avoid exams where possible at Part Two.
  • Discussing why they avoid exams, students said that the material they learn for exams does not ‘stick’ in the same way as material prepared for assessed essays and learning journals so they feel that exams are less helpful in terms of learning. Some stated that they do not believe that exams offer a fair assessment of their work.
  • Students wholly supported the use of learning journals because they spread the workload and because they facilitate learning. One issue the students emphasised, however, was that material supporting learning journals had to be thorough and clear.
  • Presentations were not rated as highly as a learning or assessment tool, though a connection with employability was recognised.
  • Assessed essays were a popular method of assessment: students said they were proud of the work they produced for summative essays and that only ‘bunched deadlines’ caused them problems (see below). This response was particularly marked at Part Two.
  • Following further discussion it emerged that our students had fewer complaints about the assessment models we used, or about the amount of assessment in the programme, than they did about the assessment feedback. This is represented below:

Figure 1: To open conversation, students placed a note on a scale. The question was, ‘Do we assess too much, about right, not enough?’ (‘About right’ was the clear winner.)

Figure 2: Students placed a note on a scale; the question was, ‘Do we give you too much feedback, about right, or too little?’ (The responses favoured the range between ‘about right’ and ‘too little’.)


The results of this exercise, together with our subsequent conversation, helped us to understand the importance of feedback to the Diversifying Assessments project; however, subsequent to the focus group meeting, the DEL Exams Board received an excellent report from our External Examiners, who stated that our feedback practices are ‘exemplary’. We will disseminate this information to our students, who, with no experience of feedback practices other than at the University of Reading, may not realise that DEL’s feedback is regarded as an example of best practice by colleagues from other institutions. We are also considering issuing our students with updates when assessed marking is underway so that they know when to expect their marks, and to demonstrate to them that we are always meeting the 15-day turnaround. The external examiners’ feedback will not, however, prevent us from continuing to reflect on our feedback processes in an effort to enhance them further.

Following the focus group meeting, we decided to test the feedback we had gathered by sending a whole-cohort online survey: for this survey, we changed the ‘feedback’ question slightly to encourage a more detailed and nuanced response. The results, which confirmed the focus group findings, are represented below (with thanks to Michael Lyons for producing these graphics for the project):

A total of 95 DEL students took part in the survey. 87% said they valued the opportunity to be assessed with diverse methods.

Assessed essays were the most popular method of assessment, followed by the learning journal. However, only a small proportion of students have been assessed with a learning journal, meaning it is likely that a high percentage of those who have been assessed this way stated it to be their preferred method of assessment.

On a scale from 0-10 (with 0 being too little, 5 about right, and 10 too much), students gave an average score of 5.1 for the level of assessment on their programmes, with 5 being both the mode and the median score.

34% found the level of detail covered most useful in feedback, 23% the feedback on writing style, 16% the clarity of the feedback, and 13% its promptness. 7% cited other issues (e.g. ‘sensitivity’) and 7% did not respond to this question.

66% said they always submit formative essays, 18% do so regularly, 8% half of the time, 4% sometimes, and 4% never do.

40% said they always attend essay supervisions (tutorials) for their formative essays, 14% do so regularly, 10% half of the time, 22% sometimes, and 14% never do.

Impact

The focus group conversation suggested that the area on which we need to focus in DEL, in terms of diversification of assessment models, is Part Two assessment provision because Part One and Part Three already have more diversified assessments. However, students articulated important concerns about the ‘bunching’ of deadlines across the programme; it may be that we need to consider the timing of essay deadlines as much as we need to consider the assessment models themselves. This is a conversation that will be carried forward into the new academic year.

Impact 1: Working with the programme requirement (two different types of assessment per module), we plan to move more modules away from the 2,000-word assessed essay and exam model that 80% of our Part Two modules have been using. We are now working towards an assessment landscape where, in the 2017-18 academic session, only 50% of Part Two modules will use this assessment pattern. The others will be using a variety of assessment models, potentially including learning journals and assessed essays; assessed presentations and assessed essays; vlogs and exams; wikis, presentations and assessed essays; and blogs and 5,000-word module reports.

Impact 2: We will be solving the ‘bunched’ deadlines problem by producing an assessments spreadsheet that will plot each assessment point on each module, allowing us to retain an overview of students’ workflow and to spread deadlines more evenly.
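To illustrate the kind of check such a spreadsheet enables (a minimal sketch in Python, assuming invented module names, dates and a seven-day ‘bunching’ window – none of which are DEL’s actual data or policy):

```python
# Hypothetical sketch: flagging 'bunched' assessment deadlines.
# Module names, dates and the window are invented for illustration.
from datetime import date
from itertools import combinations

deadlines = [
    ("EN2AA assessed essay", date(2018, 3, 12)),
    ("EN2BB learning journal", date(2018, 3, 14)),
    ("EN2CC presentation", date(2018, 5, 2)),
]

BUNCHING_WINDOW_DAYS = 7  # assumed threshold, not DEL policy

for (name_a, due_a), (name_b, due_b) in combinations(deadlines, 2):
    if abs((due_a - due_b).days) <= BUNCHING_WINDOW_DAYS:
        print(f"Bunched: {name_a} ({due_a}) and {name_b} ({due_b})")
```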

Impact 3: The next phase of the project will focus on the type, quality and delivery of feedback. Prior to the Focus Group, we had not realised how crucial this issue is, though the External Examiners’ 2017 report for DEL suggests that communication may be the more crucial factor in this regard. Nevertheless, we will disseminate the results of the online survey to colleagues and encourage more detail and more advice on writing style in feedback.

Anticipated impact 4: We are expecting enhanced attainment as a result of these changes because the new assessment methods, and the more even spread of assessment points, will allow students to present work that more accurately reflects their ability. Further, enhanced feedback will provide students with the learning tools to improve the quality of their work.

Reflections

Initially, I had some reservations about whether student focus groups could give us the reliable data we needed to underpin assessment changes in DEL. However, the combination of quantitative data (via the statistical audit I undertook and the online survey) and qualitative data (gathered via the focus groups and again by the online survey) has produced a dependable foundation. In addition, ensuring the inclusion of a diverse range of students in a focus group, drawn from all levels of the degree and from as many communities as possible within the cohort, is essential for the credibility of the subsequent analysis of responses. Thorough reporting is also essential as is the need to listen to what is being said: we had not fully appreciated how important the ‘bunched deadlines’, ‘exams’, and ‘feedback’ issues were to our students. Focus groups cannot succeed unless those convening them respond proactively to feedback.

Follow up

There will be two further DEL student focus group meetings, one in the Autumn Term 2017 (to provide feedback on our plans and to encourage reflection in the area of feedback) and one in the Spring Term 2018 (for a final consultation prior to implementation of new assessment strategies). It is worth adding that, though we have not yet advertised the Autumn Term focus group meeting, 6 students have already emailed me requesting a place on it. There is clearly an appetite to become involved in our assessment review and student contribution to this process has already revealed its value in terms of teaching and learning development.

Using online learning journals

Dr Nicola Abram, School of Literature and Languages

n.l.abram@reading.ac.uk

Overview

This entry describes the use of online Learning Journals on a Part Three English Literature module. This method of assessment supports students to carry out independent research and to reflect on their personal learning journey, and rewards students’ sustained engagement and progress.

Objectives

  • To encourage reflective learning.
  • To promote independent learning.
  • To facilitate weekly cumulative contributions to summative assessment.
  • To reward development rather than final attainment.

Context

The Part Three optional module Black British Fiction (EN3BBF) is characterised by a large number of set texts that are read at a fast pace. During a single term it covers the period from 1950 to the present day, and asks students to engage with novels, short stories, poetry, a play, and a film, as well as critical theory, history, autobiography, documentary, blogs, political speeches, and press reviews. The module is also characterised by its relevance to historical and contemporary issues of social justice. The quantity and complexity of this material requires students to exercise their independence, taking responsibility for their learning beyond the weekly three hours of tutor-led seminars.

Learning Journals had been in use for this and other modules in the Department of English Literature for several years, in the format of paper workbooks pre-printed with set questions. This effectively served the purpose of structuring students’ weekly studies and directing discussion in seminars. Students worked extremely hard to record their learning in this format, often going beyond the standard material to include additional reading and research of relevance to the module.

However, the paper workbook sometimes resulted in an excess of material that was diluted in focus and difficult to evaluate. Another problem was that the handwritten Journal was retained by the University after submission, meaning students lost this rich record of their learning.

To improve this situation, consultations were held with colleagues in the Department of English Literature and an alternative online Learning Journal was initiated in 2015/16.

Implementation

Experimentation with the Blackboard Journals tool helped to clarify its privacy controls, to ensure that tutors could see the work of all participating students but that students could not see each other’s entries. A discussion with the University of Reading TEL team clarified marking procedures, including making the Journal entries available to view by external examiners.

A discussion was held with colleagues who use paper or online Learning Journals, to establish generic assessment criteria and ensure parity of expectations.

In discussion with another module convenor it was decided that students would be required to submit ten weekly entries, each consisting of 400-500 written words or 4-5 minutes of audio or film recording. The choice of media was a proactive effort to make the Journal more accessible to students with dyslexia and those for whom English is an additional language. The subject of each entry could be determined by the student, prompted by questions on the reading list, discussion in seminars, personal reading, or other activities such as attendance at an exhibition or event.

In the first term of implementation (Autumn 2015) the full ten entries were assessed. In later iterations it was decided that students should instead select five entries to put forward for summative assessment. The selection process facilitates further self-reflection, and the option to discard some entries allows for experimentation without the threat of penalty.

The Learning Journal incorporates a vital formative function: students are invited to a 30-minute feedback tutorial to discuss their first five entries. This conversation refers to the module-specific and task-specific assessment criteria, supporting students to reflect on their work so far and to make plans to fill any gaps. The Learning Journal functions as a mode of assessment for learning, replacing the traditional task of the formative essay.

In terms of summative assessment, the five submitted Learning Journal entries account for 50% of the module mark. An essay constitutes the other 50%. These two forms of assessment are equivalent in scale, with each carrying a guideline of 2,500 words in total (for the Journal, five entries of up to 500 words each).

Impact

The fact that students could nominate a selection of entries for summative assessment seemed to encourage risk-taking. Students were more willing to experiment with their critical responses to texts – by testing speculative interpretations, asking questions, or articulating uncertainty – and to express their ideas using creative practices. They became actively engaged in directing both the form and content of their learning.

The move to a restricted length per entry was designed to encourage students to distil their ideas, and to direct attention to the aspects of that week’s learning that most mattered to the student. This was successfully achieved, and feedback shows that they could see their own progress as the weeks passed.

Feedback also showed that students appreciated the opportunity to choose their own topic for each weekly entry, without the constraints of set questions. As a result, entries were remarkably varied. Some students took the opportunity to reflect on their personal circumstances or current political contexts (such as the construction of ‘Britain’ in the discourse around the EU referendum in 2016) using the technical vocabulary learned on the course; others explored creative media such as spoken word poetry. All students gained skills in a genre of writing different from the traditional essay format, which may prove useful for careers in the communication industries.

One unexpected benefit was that the online journal made it possible for the module convenor to track the students’ learning in real-time rather than waiting for summative assessments and end-of-term evaluations. This immediate insight enabled corrective action to be taken during the course of the module where necessary.

Reflections

Students were initially nervous about this unfamiliar method of assessment. Providing detailed module-specific and task-specific marking criteria, as well as example entries, helped to allay these fears. The decision to count only a selection of entries towards summative assessment significantly helped, allowing students to acclimatise to the task with more confidence. As the term progressed, students visibly transitioned towards autonomous learning.

The Learning Journal format proved particularly effective for this module as it created a ‘safe space’ in which students could reflect on the ways in which they have personally experienced, witnessed, or practised racism. Students’ self-reflection extended beyond the subject of skills, strengths and weaknesses to consider their embodied knowledge, ignorance, or privilege. They became more critical in their thinking and more alert and responsible as citizens. Articulating the potency of this real-world engagement, one student commented that “the consistency of the learning journal […] allowed my thinking to naturally mature and changed my outlook on society”.

Marking the Journals became much more efficient using the online format, as entries were typewritten and significantly condensed. Additionally, marking and moderating could be done remotely, without the need to exchange cumbersome documents in person.

It is striking that some students achieving high marks in their Learning Journals did not always achieve equivalent marks in their essays or other modules. I do not consider this to indicate an artificial inflation of grades; rather, I would argue that the Journal recognises and rewards skills that are overlooked in traditional assessment formats and undervalued elsewhere on our programmes. Some students used the Journal to record their personal contribution to seminar discussions and be rewarded for this, while for other students less likely to speak in class (perhaps due to EAL status, gender, disability, or personality) the private entries provided an important opportunity for their insights to be heard.

Follow up

Informal spoken feedback on the general use of Learning Journals was given to the group during seminars, and one-to-one feedback was given halfway through the module. However, several students sought additional reassurance about their entries. In 2017/18 I intend therefore to incorporate a peer-review exercise into the early weeks of the term, to allow students to benchmark their work against others’ and to promote the take-up of alternative media and approaches. This activity will help students to see themselves as a community of learners. Rather than presume that students have access to technology I will supply iPads belonging to the School of Literature and Languages for use in the classroom.

I also intend to circulate example entries in audio and video formats, to show that the Journal validates skills other than traditional essay-writing and to encourage students to experiment with alternative ways of demonstrating their learning.

E-submission, marking and feedback – Pilar Gray-Carlos

OBJECTIVES

  • To facilitate the administrative process in submission of summative assessment
  • To inform module convenors and language teaching fellows of the tools supported by the University’s LMS, Blackboard Learn
  • To provide the opportunity to apply the above tools, gather experience and inform decisions on best approaches and best practice
  • To explore usability and applicability of existing marking criteria in the form of Tii (Turnitin) rubrics
  • To explore and facilitate a transition to a basic set of QuickMarks used across the Department, whilst enabling room to create language-specific amendments
  • To facilitate timely and transparent accessibility of results for students via the Grade Centre

CONTEXT

As part of summative assessment, students of intermediate to advanced language courses in IWLP Chinese, French, Italian, Japanese, German and Spanish submit a project (between 600 and 1000 words or characters according to language and stage) researched and written in the Target Language.

IWLP deals with a large volume of students each year, so it was important to explore ways of providing a point of submission that would enable staff to follow up submission deadlines and late submissions easily, eliminating paper-based trails and the involvement of multiple parties in the process, and making it timely and easy for staff to keep track of submissions.

As the majority of language teaching staff work on a part-time basis, it was felt that it would be an advantage to have a point of access to students’ work from different locations. This also meant adopting electronic marking and feedback as a way to facilitate marking and moderation remotely.

Three years ago it was unclear whether Tii would support the modern languages provided by the IWLP programme. Once it was established that it did, it was felt that the use of similarity reports would both assist teachers in detecting plagiarism and be good for student learning, as it would prompt students to revise not just content but language as well, and to rewrite where necessary.

One of the advantages of using electronic submission, marking and feedback is that both the marking criteria and the feedback can be provided in the same space, therefore avoiding reprinting and waiting for students to collect feedback. Language projects are assessed on the following areas: content, structure, vocabulary, grammatical accuracy, range of expression, syntax and variety of grammatical structure. The aim was to upload the project marking criteria in the form of rubrics, thereby bringing all the tools for marking and feeding back together in one place for tutors, providing an area readily available for moderation, and granting students ease of access to results and feedback.

There were two e-submission options to be explored: e-submission with inline grading, or e-submission via Tii assignment submission, the latter supplying the facility to use rubrics and QuickMarks via the Turnitin Suite.

IMPLEMENTATION

Initial meetings took place three years ago with members of the TEL team, which highlighted the advantages of electronic submission of written work. The meetings involved coordinators and module convenors of the languages that initially provided intermediate to advanced stages: English for Erasmus, French, German and Spanish. It was then agreed to pilot the use of electronic submission and to initially explore the use of ‘inline marking’ tools for marking and providing feedback.

Further training was arranged, delivered by both the TEL team and Pilar Gray-Carlos, and ongoing support was provided on an ad hoc basis.

The first round of assessments took place and the feedback collected from tutors was varied. Some colleagues developed feedback systems utilising tools such as colour underlining and text boxes. As not only the content but also the language is assessed, and identifying, correcting and explaining language mistakes can be a detailed process, it was felt not only that it took time to get familiar with the new system, but also that the resulting corrections and feedback were not easily accessible to students, making it necessary to print students’ work and go over corrections and feedback again with students in the classroom.

A period of required e-submission but voluntary use of electronic marking and feedback followed, until there was confirmation that Tii supported other modern languages. At this time languages such as Mandarin Chinese and Japanese had added intermediate courses to their provision. It was then decided to take the opportunity to start using Tii as a formative tool as well, thereby familiarising students with its use and enabling them to self-evaluate and readdress their own work. The use of similarity reports was enabled for formative submissions during the course, in view of the final submission of summative coursework.

Opportunities for training by the TEL team and in-house training were organised and provided by Pilar Gray-Carlos and more experienced colleagues within ISLI. In this way module convenors and tutors were shown how rubrics and QuickMarks are used for marking and feedback in language teaching (see Rob Playfair’s case study and Jonathan Smith’s interview).

At the same time, and in parallel with the work on e-submission, marking and feedback, a Grade Centre was created for the 31 modules provided in the 10 different languages. Weighted columns were created per assessment per module; teachers could directly input results and students would have direct access to marks as they were released.

Having all that data available also meant that some reports, although limited, could be produced on module performance per assessment and, for languages where classes are taught in parallel groups, even on group performance.
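For illustration, a weighted column simply combines the marks in several assessment columns according to agreed weightings. The sketch below is a minimal example in Python; the weightings and marks are invented, not the actual IWLP scheme.

```python
# Hypothetical sketch of what a weighted Grade Centre column computes.
# Weightings and marks are invented for illustration.
weights = {"project": 0.4, "oral exam": 0.3, "written exam": 0.3}
marks = {"project": 68, "oral exam": 72, "written exam": 61}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weightings must sum to 100%
module_mark = sum(weights[a] * marks[a] for a in weights)
print(round(module_mark, 1))  # 67.1
```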

Since then the EMA Core Systems Team has delivered a more streamlined process which produces similar data sets on RISIS (for more detailed information see the EMA Programme short videos link below).

IMPACT

The use of e-submission has enabled a variety of approaches to formative assessment to flourish; some languages have made the most of using e-submission to collect students’ work and to feed back online.

As for summative assessment, the adoption of QuickMarks is facilitating marking, and once teachers grow accustomed to using them it becomes an efficient way to point out generic language errors.

The use of the Grade Centre was a success, as it cut down on administration, freeing up time for administrators and teachers, and it provides helpful information as to the performance of certain cohorts and groups. The only drawback was the missing step between the Grade Centre and RISIS. At that point in time the only way to update records in RISIS was by downloading all marks in the form of a spreadsheet and manually inputting them into RISIS. The EMA Programme Core Systems Workstream is working to improve the integration between Blackboard and RISIS.

REFLECTIONS

The feedback obtained from the teachers indicates that there is healthy satisfaction surrounding e-submission. It is also positive with regard to marking content, but divided about how to approach correction and feedback on language items, as these can be as particular as the individual and as the language itself. In this sense, written adjustments and examples need to be inserted in the text, an option that seems faster on paper than electronically; the very specific context of inserting grammatical symbols into a text in language teaching may therefore need some additional thinking. The EMA Team are looking at requirements surrounding scientific, mathematical and grammatical notations within the University and possible ways forward.

The general consensus is that, of the two options available at present, Tii is better for language projects than inline marking. To enable that transition we need to review the set of rubrics we are using and adopt sets of QuickMarks applicable to all languages, perhaps with the addition of specific sets for non-Latin scripts.

FOLLOW UP

A small working group will be set up to revise QuickMarks across all languages. It will also look into the rubrics and how we can best customise them for our assessment purposes, in line with the CEFR (Common European Framework of Reference for Languages).

LINKS

EMA Project Reading Resources

http://www.reading.ac.uk/internal/ema/ema-resources.aspx

Common European Framework of Reference for Languages (CEFR)

https://www.coe.int/en/web/common-european-framework-reference-languages/

Using QuickMarks and rubrics in online assessment – Catherine Foley

Catherine Foley is a lecturer in Primary Maths Education in the Institute of Education. She is Director of the Primary School Direct programme which trains people to be teachers whilst they are working in schools.

Image of Catherine Foley

OBJECTIVES

Catherine describes her experience of using the Feedback Studio to move from Word-based marking of an assignment to full use of GradeMark.

CONTEXT

Catherine's experience of electronic marking relates primarily to a 20-credit postgraduate module within the Primary School Direct programme, developing the reflective practice and critical thinking of the trainees. The module is assessed through one piece of written work, assessed both formatively and summatively, and is taken by approximately 80 students each year.

IMPLEMENTATION

Up until the current academic year, although students would submit their work through Turnitin (for both formative and summative attempts), they would receive feedback in the form of underlined grading sheets and text-based comments, which were completed for each student and uploaded for release via Grade Centre. As with other IoE programmes, all submission, grading and feedback for this assessment is now carried out electronically.

This year, we decided to use the full electronic feedback option for both assessments, since the first, formative, experience would give students (and staff) the chance to get used to the system. We developed our own rubric for the assessment. For the formative assessment we decided not to use QuickMarks but simply to focus on becoming familiar with the rubric. For the summative assessment, both a rubric and QuickMarks were used: the QuickMark set is the same as that used for other initial teacher training programmes.

In my own marking, I found it helpful, when getting started, to open out the full rubric in a grid from the sidebar in the Feedback Studio. After a while, I was clear about what the different statements meant and so could use the sliders more confidently.

IMPACT

  • Speed of marking. Although marking has not been any quicker so far overall, it is likely that this will speed up as the administrative problems are ironed out and we get to know the
    system. Not having to save individual files saves a lot of time which can be spent on quality feedback.
  • Ease of moderation. Because all the assessment and feedback is in the same place, it is much more straightforward and a module convenor is easily able to quality-assure the marking
    that is taking place.
  • Curriculum review opportunity. Developing our own rubric for the assessment encouraged us to review what we had been doing. It made us stop and examine our taken-for-granted practice.
  • Student ownership of feedback. We had a workshop on developing academic writing and it was interesting to see all the students with their laptops open, looking at very specific
    pieces of contextualised feedback received online for their first assignment.
  • Using rubric reports for bespoke study advice sessions. We used the function in Turnitin to generate a report on how well students had achieved as a cohort in relation to the different rubric themes. We sent the report to one of the study advisers, who was then able to use it to pinpoint areas to focus upon in helping students work towards their next assignment (a sketch of this kind of cohort summary follows this list).
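
To make the cohort rubric report concrete, here is a rough Python sketch of the idea: mean achievement per rubric theme across a cohort, with the weakest theme surfacing first. Turnitin generated our actual report; the criteria names, scale and scores below are invented purely for illustration.

    # Invented rubric scores on an assumed 1-5 scale; criteria names
    # are hypothetical, not our actual rubric.
    import pandas as pd

    scores = pd.DataFrame({
        "student":        ["A", "B", "C", "D"],
        "criticality":    [3, 2, 4, 2],
        "use_of_reading": [4, 3, 4, 3],
        "structure":      [2, 2, 3, 2],
    })

    # Mean score per rubric theme across the cohort, weakest first.
    cohort = scores.drop(columns="student").mean().sort_values()
    print(cohort)
    print("Suggested focus for study advice:", cohort.index[0])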

REFLECTIONS

Many of the challenges we experienced were due to the fact that the assessment is marked by five different members of staff:

  • When we were using Word-based documents for feedback, we could more easily shape and guide the feedback which tutors were giving (for example with a writing frame). In the Feedback Studio, the text comment box presents markers with a blank space, so it has been harder to ensure a common approach across markers. We therefore agreed a common structure for feedback in this box.
  • The marking team had differing levels of experience with electronic marking. Because the QuickMark set had to be uploaded by each marker to their own Blackboard account, and not all markers were on campus at the same time, this was a logistical challenge.
  • With the options for QuickMarks, rubric statements and open text comments, it would be easy for markers to over-assess each piece of work. Our agreement was that, since students were already receiving feedback through the first two channels, the final text comment should be brief, simply recognising specific areas of success and then pinpointing areas for development.
  • Limitations in the functionality of the Feedback Studio. Some markers liked to be able to use Word to check the number of times a student had used a particular phrase, or to look at the consistency between citations and references: you can't currently move around the document so easily (unless you download it). Some warning or confirmation messages from the system (for example when moving on to the next piece of work) would make it still more user-friendly. With several people involved in marking an assignment, it is easy for markers to accidentally change each other's grades – it would be helpful if grades and comments could be 'locked' in some way. Are different levels of access possible, so that external examiners can see the Feedback Studio without being able to change feedback?
  • There are still issues (mostly to do with administrative protocols) to iron out. The IoE is currently reviewing its moderation processes and determining the extent to which students know their work has been included. Programme directors are working with their admin teams to determine exactly how academics will be informed when an ECF assignment has been submitted.