Reframing Identity 360

Kate Allen, Department of Art, k.allen@reading.ac.uk

Overview

Reframing Identity 360 was an investigative artwork exploring identity using 360 cameras, developed through practical, alumni-led workshops and socially engaged art with current art students, school groups and the general public. It formed part of ‘ArtLab Movement’ at Tate Exchange (TEx) at Tate Modern in March 2019 and is archived on the ArtLab website.

Objectives

- Contribute to a live art event and outreach work experience led by alumni at Tate Exchange, 1-3 March 2019

- Explore identity capture with 360 cameras

- Experiment with 360 cameras, including designing, capturing, printing and editing

- Create portraits with purpleSTARS, a group of people with and without learning disabilities, and with children from Widening Participation schools in Reading

Context

Reframing Identity explored self-portraits shot in 360, developed as a response to Tania Bruguera’s Turbine Hall commission concerning institutional power, borders and migration. Can 360 self-portraits raise awareness of how interconnected we are? With no person ever behind a 360 camera, everyone is included.

Implementation

Alumna and virtual reality artist Kassie Headon researched ideas in response to Tania Bruguera’s installation at Tate Modern, inspired by Bruguera’s ideas on inclusion and connecting to Kate Allen’s research with purpleSTARS, a group of people with and without learning disabilities who aim to make museums more inclusive. Kassie demonstrated to students and purpleSTARS how to use the GoPro Fusion camera and the app used to edit 360 content. Activities to share the 360 self-portrait concept with visitors were developed, including drawing cylindrical self-portraits which participants could then wear on their heads for a 360 selfie. Students facilitated the Reframing Identity 360 workshop as part of ArtLab Movement at TEx. Using 360 cameras was a new experience and concept for our students and for most people visiting TEx. The 360 self-portraits were exhibited via a live video stream from the 360 cameras to an iPad displayed at the Tate, letting participants explore the views, which they could manipulate and distort to create the desired effect. Participants’ 360 self-portraits were also printed or sent to their phones.

Impact

Reframing Identity 360 created access to, and inclusion with, new technologies for students and the public. Experiencing the live video stream frequently gave visitors an ‘Oh wow’ moment. TEx provided an opportunity for research-led teaching, with Dr Allen, purpleSTARS, alumna Kassie Headon and current BA students exploring the concept of 360 self-portraits, gaining professional practice experience facilitating the workshops, and developing technical skills working with the 360 camera. The 360 cameras are now part of the digital equipment available to students, with a core team of ArtLab students familiar with their potential and how to use them.

Reflections

Working with new technologies in collaboration with alumni, ArtLab students and purpleSTARS led to new perspectives on ideas of inclusion and self-portraiture. The experimental research occurred in response to work at the Tate and in collaboration with visitors to TEx. The project built capacity and awareness as new technology was introduced into the Art Department, with students learning through research and practical experience about its potential to create artworks and inclusive engagements.

Follow up

Kassie Headon continued to work with the 360 camera, collaborating with Widening Participation schools during the ArtLab summer workshops in 2019, exploring spaces and manipulating 2D versions of 3D space.

We are developing further research collaborations and research-led teaching opportunities exploring inclusion in museums and immersive virtual reality artworks/experiences using Oculus Rift technology.

Links and References

We created a 360 recording of our Reframing Identity event at the Tate: https://www.thinglink.com/mediacard/1158753748827242499?autoplay=0&autorotate=0&displaytitle=1&rel=1

ArtLab documents the workshop

https://readingartlab.com/2019/04/25/artlab-tate-exchange-visual-diary-2nd-and-3rd-march-2019/

purpleSTARS web documentation

https://purplestars.org.uk/2017/11/12/purplestars-at-tate-gallery-2018/

Tate Exchange webpage

https://www.tate.org.uk/whats-on/tate-modern/tate-exchange/workshop/reading-assembly-movement

Improving assessment writing and grading skills through the use of a rubric – Dr Bolanle Adebola

Dr Bolanle Adebola is the Module Convenor and lecturer for the following modules on the LLM Programme (on campus and distance learning):

International Commercial Arbitration, Corporate Governance, and Corporate Finance. She is also a Lecturer for the LLB Research Placement Project.

Bolanle is also the Legal Practice Liaison Officer for the CCLFR.


OBJECTIVES

For students:

• To make the assessment criteria more transparent and understandable.
• To improve assessment output and essay writing skills generally.

For the teacher:

• To facilitate assessment grading by setting clearly defined criteria.
• To facilitate the feedback process by creating a framework for dialogue which is understood both by the teacher and the student.

CONTEXT

I faced a number of challenges in relation to the assessment process in my first year as a lecturer:

• My students had not performed as well as I would have liked them to in their assessments.

• It was my first time having to justify the grades I had awarded, and I found that I struggled to articulate clearly and consistently the reasons for some of them.

• I had been newly introduced to the step-marking framework for distinction grades, as well as the requirement to make full use of the grading scale, which I found challenging in view of the quality of some of the essays I had graded.

I spoke to several colleagues but came to understand that there were as many approaches as there were people. I also discussed the assessment process with several of my students and came to understand that many were both unsure and unclear about the criteria by which their assessments were graded across their modules.
I concluded that I needed to build a bridge between my approach to assessment grading and my students’ understanding of the assessment criteria. Ideally, the chosen method would facilitate consistency and the provision of feedback on my part, and improve the quality of essays on my students’ part.

IMPLEMENTATION

I tend towards the constructivist approach to learning, which means that I structure my activities towards promoting student-led learning. For summative assessments, my students are required to demonstrate their understanding of, and ability to critically appraise, legal concepts that I have chosen from our sessions in class. Hence, the main output for all summative assessments on my modules is an essay. Wolf and Stevens (2007) assert that learning is best achieved where all the participants in the process are clear about the criteria for the performance and the levels at which it will be assessed. My goal therefore became to ensure that my students understood the elements I looked for in their essays, these being the criteria against which I graded them. They also had to understand how I decided the standards that their essays reflected. While the student handbook sets out the various standards that we apply in the University, I wanted to provide clearer direction on how students could meet those standards, and on how I determine that an essay meets them.

If the students were to understand the criteria I apply when grading their essays, then I would have to articulate them. Articulating the criteria for a well-written essay would benefit both me and my students. For my students, in addition to a clearer understanding of the assessment criteria, it would enable them to self-evaluate, which would improve the quality of their output. Improved quality would lead to improved grades, and I could give effect to university policy. Articulating the criteria would benefit me because it would facilitate consistency. It would also enable me to give detailed and helpful feedback to students on the strengths and weaknesses of the essays being graded, as well as on their essay-writing skills in general, with advice on how to improve different facets of their outputs going forward. Ultimately, my students would learn valuable skills which they could apply across the board and after they graduate.
For assessments which require some form of performance, essays being an example, a rubric is an excellent evaluation tool because it fulfils all the requirements I have expressed above (Brookhart, 2013). Hence, I decided to present my grading criteria and standards in the form of a rubric.

The rubric is divided into 5 criteria which are set out in 5 rows:

  • Structure
  • Clarity
  • Research
  • Argument
  • Scholarship.

For each criterion, there are 4 performance levels, which are set out in 4 columns: Poor, Good, Merit and Excellent. An essay is mapped along each row and column. The final mark depends on how the student has performed on each criterion, as well as my perception of the output as a whole.
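For illustration, one row of such a grid might read as follows; the wording below is indicative only, not the actual text of the rubric:

For the ‘Argument’ criterion:

  • Poor: assertions are made without supporting authority or evidence.
  • Good: arguments are supported by relevant authority, but counter-arguments are not addressed.
  • Merit: arguments are well supported and key counter-arguments are acknowledged.
  • Excellent: arguments are persuasive, fully supported, and counter-arguments are critically weighed.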

Studies suggest that a rubric is most effective when produced in collaboration with the students (Andrade, Du and Mycek, 2010). However, when I created my rubric, I did not involve my students. I thought that would not be necessary given that my rubric was to be applied generally and with changing cohorts of students. Notwithstanding, I wanted students to engage with it. So, the document containing the rubric has an introduction addressed to the students, which explains the context in which the rubric has been created. It also explains how the rubric is applied and the relationship between the criteria. It states, for example, that ‘even where the essay has good arguments, poor structure may undermine its score’. It explains that the final grade combines both objective assessment and a subjective evaluation of the output as a whole, which is based on the marker’s discretion.

To ensure that students are not confused about the standards set out in the rubric and the assessment standards set out in the student handbook, the performance levels in the rubric are mapped against the standards in the handbook. The document containing the rubric also contains links to the relevant handbook. Finally, the rubric gives the students an example of how it would be applied to an assessment. Thereafter, it sets out the manner in which feedback will be presented to the students. That helps me create a structure for providing feedback which both the students and I understand clearly.

IMPACT

My students’ assessment outputs have been of much better quality, and have achieved better grades, since I introduced the rubric. In one of my modules, the average grade, as recorded in the module convenor’s report to the external examiner (MC’s Report) for 2015/16, was 64.3%; 20% of the class attained distinctions, all in the 70-79 range. That year, I struggled to give feedback and was asked to provide additional feedback comments to a few students. In 2016/17, after I introduced the rubric, there was a slight dip in the average mark to 63.7%. The dip was caused by a fail mark amongst the cohort; if that fail mark is controlled for, the average had crept up from 2015/16. There was a clear increase in the percentage of distinctions, which went up to 25.8% from 20% in the previous year. The cross-over came from students who had been in the merit range. Clearly, some students had been able to use the rubric to improve the standards of their essays. I found the provision of feedback much easier in 2016/17 because I had clear direction from the rubric. When giving feedback I explained both the strengths and weaknesses of the essay in relation to each criterion. My hope was that students would apply the advice more generally, as the method of assessment is the same across their other modules. In 2017/18, the average mark for the same module went up to 68.84%; 38% of the class attained distinctions, with 3% attaining more than 80%. Hence, in my third year, I have also been able to utilise step-marking in the distinction grade, which has enabled me to meet the university’s policy.

When I introduced the rubric in 2016/17, I had a control module, by which I mean a module in which I neither provided the rubric nor spoke to the students about their assessments in detail. The quality of assessments from that module was much lower than the others where the students had been introduced to the rubric. In that year, the average grade for the control module was 60%; with 20% attaining a distinction and 20% failing. In 2017/18, while I did not provide the students with the rubric, I spoke to them about the assessments. The average grade for the control module was 61.2%; with 23% attaining a distinction. There was a reduction in the failure rate to 7.6%. The distinction grade also expanded, with 7.6% attaining a higher distinction grade. There was movement both from the failure grade and the pass grade to the next standard/performance level. Though I did not provide the students with the rubric, I still provided feedback to the students using the rubric as a guide. I have found that it has become ingrained in me and is a very useful tool for explaining the reasons for my grades to my students.

From my experience, I can assert, justifiably, that the rubric has played a very important role in improving the students’ essay outputs. It has also enabled me to improve my feedback skills immensely.

REFLECTIONS

I have observed that, as the studies in the field argue, it is insufficient merely to have a rubric. For the rubric to achieve the desired objectives, it is important that students actively engage with it. I must admit that I did not take a genuinely constructivist approach to the rubric. I wanted to explain myself to the students. I did not really encourage a two-way conversation, as the studies recommend, and I think this limited the effectiveness of the rubric.

In 2017/18, I decided to talk the students through the rubric, explaining how they can use it to improve performance. I led them through the rubric in the final or penultimate class. During the session, I explained how they might align their essays with the various performance levels/standards. I gave them insights into some of the essays I had assessed in the previous two years; highlighting which practices were poor and which were best. By the end of the autumn term, the first module in which I had both the rubric and an explanation of its application in class saw a huge improvement in student output as set out in the section above. The results have been the best I have ever had. As the standards have improved, so have the grades. As stated above, I have been able to achieve step-marking in the distinction grade while improving standards generally.

I have also noticed that even where a rubric is not used, but the teacher talks to the students about the assessments and their expectations of them, students perform better than where there is no conversation at all. In 2017/18, while I did not provide the rubric to the control module, I discussed the assessment with the students, explaining practices which they might find helpful. As demonstrated above, there was a lower failure rate and improvement generally across the board. I conclude, therefore, that assessment criteria ought to be explained much better to students if their performance is to improve. However, I think that having a rubric, and student engagement with it, is the best option.

I have also noticed that many students tend to perform well, in the merit bracket. These students would like to improve but are unable to work out how to do so. They, in particular, find the rubric very helpful.

In addition, Wolf and Stevens (2007) observe that rubrics are particularly helpful for international students whose assessment systems may have been different, though no less valid, from that of the system in which they have presently chosen to study. Such students struggle to understand what is expected of them and so, may fail to attain the best standards/performance levels that they could for lack of understanding of the assessment practices. A large proportion of my students are international, and I think that they have benefitted from having the rubric; particularly when they are invited to engage with it actively.

Finally, the rubric has improved my feedback skills tremendously. I am able to express my observations and grades in terms well understood both by myself and my students. The provision of feedback is no longer a chore or a bore. It has actually become quite enjoyable for me.

FOLLOW UP

On publishing the rubric to students:

I know that Blackboard gives the opportunity to embed a rubric within each module. So far, I have only uploaded copies of my rubric onto Blackboard for the students on each of my modules. I have decided to explore the Blackboard option to make the annual upload of the rubric more efficient. I will also see whether Blackboard offers opportunities to improve the rubric, which will be a couple of years old by the end of this academic year.

On the Implementation of the rubric:

I have noted, however, that it takes about half an hour per module to explain the rubric to students, which eats into valuable teaching time. A more efficient method is required to provide good assessment insight to students. This summer, as examinations officer, I will liaise with my colleagues to discuss the provision of a best-practice session for our students in relation to their assessments. At the session, students will also be introduced to the rubric. The rubric can then be paired with actual illustrations, which the students can be encouraged to grade using its content. Such sessions will improve their ability to self-evaluate, which is crucial both to their learning and to the improvement of their outputs.

LINKS

• K. Wolf and E. Stevens (2007) 7(1) Journal of Effective Teaching, 3. https://www.uncw.edu/jet/articles/vol7_1/Wolf.pdf
• H. Andrade, Y. Du and K. Mycek, ‘Rubric-Referenced Self-Assessment and Middle School Students’ Writing’ (2010) 17(2) Assessment in Education: Principles, Policy & Practice, 199. https://www.tandfonline.com/doi/pdf/10.1080/09695941003696172?needAccess=true
• S. Brookhart, How to Create and Use Rubrics for Formative Assessment and Grading (Association for Supervision & Curriculum Development, ASCD, VA, 2013).
• Turnitin, ‘Rubrics and Grading Forms’ https://guides.turnitin.com/01_Manuals_and_Guides/Instructor_Guides/Turnitin_Classic_(Deprecated)/25_GradeMark/Rubrics_and_Grading_Forms
• Blackboard, ‘Grade with Rubrics’ https://help.blackboard.com/Learn/Instructor/Grade/Rubrics/Grade_with_Rubrics
• Blackboard, ‘Import and Export Rubrics’ https://help.blackboard.com/Learn/Instructor/Grade/Rubrics/Import_and_Export_Rubrics

Using personal capture to support students to learn practical theory outside of the laboratory

Dr Geraldine (Jay) Mulley – School of Biological Sciences  

Overview

I produced four screencasts to encourage students to prepare better for practical classes and to reinforce practical theory taught in class. Approximately 45% of the cohort watched at least some of the video content, mainly in the few days leading up to the practical assessment. The students appreciated the extra resources, and there was a noticeable improvement in module satisfaction scores.

Objectives

  • To provide consistency in the delivery of practical theory teaching between groups led by different practical leaders
  • To provide students with engaging resources for use outside the classroom, both as preparation tools for practical classes and as revision aids for the Blackboard-based practical assessment

Context

The Part 1 Bacteriology & Virology module includes 12 hours of practical classes designed to teach students key microbiological techniques and theory. I usually begin each practical with a short lecture-style introduction to explain what students need to do and why. The 3-hour classes are typically very busy, and I have observed that some students feel overwhelmed with “information overload” and find it hard to assimilate the theory whilst learning the new techniques. I have had to schedule multiple runs of practical classes to accommodate the large cohort, and my colleagues now teach some of the repeat sessions. My aim was to create a series of videos explaining the theoretical background in more detail, which students can access outside the classroom. I hoped this would ensure consistency in what is taught to each group and give the students more time to focus on learning the techniques during the classes, and that they would use the resources both to prepare for the classes and as a revision aid for the practical assessment.

Implementation

I initially tried to record 4 videos by simply recording myself talking through the original PowerPoint presentations that I use in the practical class introductions (i.e. 4 individual videos, one for each of the 4 practical classes). Having started to make the videos, I realised that it was very difficult for me to explain the theory in this format, which was quite surprising given this is how I had been delivering the information up until that point! I therefore adapted the PowerPoint presentations to make videos focusing on each of the experimental themes, talking through what the students would do in the lab week by week, with an explanation of the theory at appropriate points. I recorded the video tutorials using the Mediasite “slideshow + audio” option and narrated freestyle, as I would in a lecture (no script). When I made a mistake, I paused for a few seconds and then started the sentence again. After finishing the entire recording, I used the editing feature to cut out the mistakes, which were easy to identify in the audio trace due to the long pauses. I was also able to move slides to the appropriate place if I had poorly timed the slide transitions. Editing each video took around 30 minutes to 1 hour. I found it relatively easy to record and edit the videos, and I became much more efficient after the first few.

I would have liked to have asked students and other staff to help in the design and production of the videos, but the timing of the Pilot was not conducive to being able to collaborate at the time.

Impact

Mediasite analytics show that 45% of the students in the cohort viewed at least some of the resources, and 17% of the cohort viewed each video more than once. Students watched the three shorter videos (3-4 min) in their entirety, but the longest video (18 min) showed a drop-off in the number of views after approximately 5 minutes (Figure 1), so in future I will limit my videos to 5 minutes maximum.

Figure 1: Graph showing how students watched the video

Only a few students viewed videos prior to practical classes; almost all views came in the few days leading up to the practical assessment on Blackboard. This shows that students were using the videos as a revision aid rather than as a preparation tool. This is probably because I uploaded the videos midway through term, by which stage one of the three groups had already completed the 4 practical classes; I did not want to disadvantage this group by promoting the videos as a preparation tool. It will be interesting to see whether I can encourage students to use the videos for preparation next academic year. My expectation was that time spent viewing would directly correlate with practical assessment grades; however, there is not a clear linear correlation (Figure 2).

Figure 2: Graph showing use of videos and grades obtained

For some students, attending the practical classes and reading the handbook is enough to achieve a good grade. However, students who spent time viewing the videos did get a higher average than those who did not view any (Figure 3), although this probably reflects overall engagement with all the available learning resources. Responses to the student survey indicated that students felt the videos improved their understanding of the topic and supported them to revise what they had learnt in class at their own pace.

Figure 3: Graph showing video watching and grades obtained
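For anyone wishing to repeat this comparison on their own module data, the sketch below shows one way to test the correlation in Figure 2 and the group averages in Figure 3. It is an illustration only, not the analysis used here, and assumes the Mediasite viewing times and grades have been merged into a CSV with hypothetical columns "minutes_viewed" and "grade":

    # Minimal sketch (hypothetical file and column names) of comparing
    # viewing time with assessment grades.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("viewing_vs_grades.csv")  # hypothetical merged export

    # Test for a linear correlation between viewing time and grade (cf. Figure 2).
    r, p = stats.pearsonr(df["minutes_viewed"], df["grade"])
    print(f"Pearson r = {r:.2f} (p = {p:.3f})")

    # Compare average grades of viewers and non-viewers (cf. Figure 3).
    viewers = df.loc[df["minutes_viewed"] > 0, "grade"]
    non_viewers = df.loc[df["minutes_viewed"] == 0, "grade"]
    print(f"Viewers mean grade: {viewers.mean():.1f}")
    print(f"Non-viewers mean grade: {non_viewers.mean():.1f}")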

Reflections

The biggest challenge I faced was trying to recruit other colleagues to the Pilot during a very busy autumn term, and finding the time to design the videos myself. It would have been helpful to see some examples of how to use personal capture before I started, but having participated in the Pilot, I now have more confidence. Once I had experimented with the Mediasite software, I found it quite easy to record the videos and publish them to my Blackboard site (with guidance from the excellent TEL team and the Blackboard help web pages). I liked the editing tools, although I would very much like the ability to cut and paste different videos together. The analytics are very useful and much better than the “track users” function in Blackboard. They reinforced the suggestion that students are much more likely to finish watching short videos, and I would advise making videos 5 minutes maximum, ideally 3 minutes, in length. My experience of personal capture was incredibly positive, and I will certainly be making more resources for my students for all my modules.

Follow-up

Since making the recordings for the Pilot, I have teamed up with several colleagues in the School of Biological Sciences and will show them how to use Mediasite so that they can make resources for their modules over summer. I have also used the Mediasite software to record microscope training sessions and talks from open days.

Building bridges and smoothing edges

Patrick Finnegan – School of Economics, Politics & International Relations

Overview

My use of the personal capture scheme was intended to enhance our teaching methods within the department. My initial aim of building additional video capture material into the ongoing lecture series did not come to fruition, but I was able to use the capture package to engage my students more in the administration of a (then) overly complicated module.

Objectives

  • Initial plan centred on including personal capture on the Army Higher Education Pathway project – this was not possible due to software incompatibility with the Canvas platform used for the project
  • New objectives were based on a different module (The Study of Politics) and improving the student experience on that module
  • Improve the explanation of methods
  • Explain the supervisory choice system
  • Enhance lectures on complicated topics

Context

The module I focused on was Po2SOP (The Study of Politics), with 160 students. Personal capture was valuable on this project as it allowed me, as convenor of our largest module, to communicate with all of my students in a more engaging way. We needed a way to bring the topic to life and ensure that the students took on board the lessons we needed them to. I wanted to include real examples of the methods in action and to use the screencasts to explain certain decisions that would be too difficult to convey via email.

Implementation

Unfortunately, the project began too late in the term to really affect the lectures on this module, which is co-taught between several staff members, often using pre-existing slides. However, I was able to use it to engage in discussion with students, to explain issues such as supervisor reallocation during the year and how our special event – the mini-conference – was to work. Rather than writing lengthy emails, I was able to explain quickly and visually to the students what was happening and to invite their responses, which some gave. They did not engage with the capture material as such, but my use of it did encourage discussion about how they would like to see it used in future, and how they would like to receive feedback on assessments if audio/visual options were available. The recordings made by myself and my colleague were mainly PowerPoint voice-overs or direct-to-camera discussions. These allowed us to present the students with illustrations and ‘first-hand’ information. They required significant editing to make sure they were suitable, but the final product was satisfactory.

Impact

Beyond ‘ease of life’ effects this year, there was not a great deal of impact, but this was expected given the start date (the largest number of views for a video was 86, and that was an admin-explanation style video). However, planning for next year has already incorporated the different potential advantages offered by personal capture. For example, the same methods module will now incorporate tutorial videos made within the department and will maintain some supervisor ‘adverts’ to allow students to better choose which member of staff they will seek to work with in future. Within other modules, some staff members will be taking the opportunity to build in some flipped-classroom style teaching and other time-heavy elements that were not previously available to them.

Reflections

The time needed to organise and direct co-pilots within a teaching-heavy department was a lot greater than I originally planned. I was also not expecting to meet the level of resistance that I did from some more established staff, who were not interested in changing how they delivered the material they had prepared earlier. The major difference I would make going forward would be to focus on upcoming modules rather than pre-existing ones, as incorporating the material once the module had already started was too difficult.

Follow-up

I have started to prepare some videos on material I know will be needed in the future; this is relatively straightforward to do and will mimic the general practice to date. The main evolution will be seen in responses to student need during class, and in how screencasts can be made on demand and with consistent quality.

Creating screencast videos to support and engage post-graduate students

Sue Blackett – Henley Business School, 2018-19


Overview

I participated in the university’s Personal Capture pilot as a Champion for my school, trialling the Mediasite tool to create screencast videos for use in teaching and learning. My aim was to help PGT students get to grips with key elements of the module. The videos let students view content repeatedly, with the aim of increasing engagement with the module. Some videos were watched multiple times at different points throughout the term, indicating that information needed to be refreshed.

Objectives

  1. Connect with the cohort and establish module expectations.
  2. Reduce class time taken up with module administration.
  3. Provide coursework feedback in an alternative form and reinforce its feedforward use for the final exam.
  4. Provide exam revision advice and highlight areas of focus.
  5. Support students with weaker English language skills.
  6. Provide module materials in a reusable, accessible and alternative form.

Context

The target audience was students on ACM003 Management Accounting Theory & Practice, a postgraduate course where 91% of students were native Mandarin speakers. English language skills were an issue for some students, so capture video provided opportunities for them to re-watch and get to grips with the content at their leisure. In addition, I wanted to free up class contact time so I could focus on content in areas that had been more challenging on the previous run of the module. Also, using different colours and font sizes on the PowerPoint slides visually emphasised key points, reinforcing the accompanying audio.

Implementation

The first video recorded was a welcome-to-the-module video (slides and audio only) that covered the module administration, i.e. an overview of the module, an outline of assessment, key dates, the module textbook, etc. The content for the video was relatively straightforward, as it was taken from the first lecture’s slides. By isolating module admin information, more information could be added, e.g. mapping assessable learning outcomes to assessments and explaining the purpose of each type of assessment. When first recording the video, I did not follow a script, as I was trying to make my delivery sound more natural. Instead, I made short notes on slides that needed extra information and printed off the presentation as slides with notes. As this is the same strategy that I use to deliver lectures, I was less concerned about being “audio ready”, i.e. not making errors in my voice recording.

In the second and third videos (coursework feedback and exam revision advice), I included video of myself delivering the presentations. As the recordings were made in my home office, additional visual matters had to be considered, including what I was wearing, the background behind me, looking into the camera, turning pages, etc. The second attempts at each recording were much more fluent and were therefore uploaded to Blackboard.

The last two recordings were quite different in nature. The coursework feedback used visuals of bar charts and tables to communicate statistics, accompanied by audio that focused on qualitative feedback. The exam revision video used lots of narrative bullet points.

Examples of my videos:

Welcome to module: https://uor.mediasite.com/Mediasite/Play/7a7f676595c84507aa31aafe994f2f071d

Assessed coursework feedback: https://uor.mediasite.com/Mediasite/Play/077e974725f44cc8b0debd6361aaaba71d

Exam revision advice: https://uor.mediasite.com/Mediasite/Play/94e4156753c848dbafc3b5e75a9c3d441d

Resit Exam Advice: https://uor.mediasite.com/Mediasite/Play/e8b88b44a7724c5aa4ef8def412c22fd1d

Impact

The welcome video did have impact, as it was the only source of information about the administration of the course. When students arrived at the first class with the textbook, this indicated that they had been able to access the information they needed to prepare for the course. The student response rate to the personal capture pilot project questionnaire was low (18%); however, the general feedback was that the videos were useful in supporting them during the course.

Analysis of the analytics from Mediasite and Blackboard provided some very interesting insights:

  1. Most students did not watch the videos as soon as they were released. 
  2. Some of the videos were watched multiple times throughout the term by weaker and stronger students. 
  3. Some students were not recorded as having accessed the videos. 
  4. Students were focused for the first 20 – 60 seconds of each video and then skipped through the videos. 
  5. Few students watched the videos from start to finish, e.g. the average time watched for the 4 min 49 sec welcome video was 2 min 10 sec; the coursework feedback video was 9 min 21 sec, with an average viewing time of 3 min 11 sec; the revision video followed the same trend, being 8 min 41 sec long with an average watching time of 2 min 55 sec.

Reviewing the videos alongside the watching trends showed that students skipped through to the points where slides changed. This suggests that the majority were reading the slides rather than listening to the accompanying commentary, which contained supplementary information.

As no student failed to meet the admin expectations of the course, those who had not watched the welcome video had presumably been informed by those who had.

Reflections

The analytics were most illuminating. My appearing in the videos was supposed to establish a bond with the cohort and increase engagement; however, my appearance seemed to be irrelevant, as the students focused on reading rather than listening. This could have been due to weaker listening skills, but it also suggests that students may think all important information is written down rather than spoken.

Videos with graphics were watched more than those without, so my challenge will be to think about what content I include in slides, i.e. more graphics with fewer words and/or narrative slides with no audio.

I will continue with capture videos; however, I will do more to test their effectiveness. For example, I will design in-class quizzes using Kahoot, Mentimeter, etc. to test whether the content of the videos has been internalised.

Follow-up

I’ve become much quicker at designing the PowerPoint content and less worried about stumbling or searching for the right words to use. I have been able to edit videos more quickly, e.g. cutting out excessive time and cropping the end of the video. Embedding videos in Blackboard has also become easier the more I’ve done it. The support information was good; however, I faced a multitude of problems that IT Support had to help me with, which, if I’m honest, was putting me off using the tool (I’m a Mac user, mostly using the tool off campus).

 

Adopting a flipped classroom approach to meet the challenges of large group lectures

Amanda Millmore / School of Law / a.millmore@reading.ac.uk

Overview

Faced with double-teaching a cohort of 480 students (plus an additional 30 at the University of Reading Malaysia), I was concerned to ensure that students in each lecture group had a similar teaching experience. My solution was to “flip” some of the learning, by recording short video lectures covering content that I would otherwise have lectured live, and to use the time freed up to slow the pace and instigate active learning within the lectures. Students provided overwhelmingly positive feedback in formal and informal module evaluations. The introduction of flipped learning has also aided student welfare, allowing those who are absent or who have disabilities or language barriers to revisit material as and when needed. For staff, it has reduced my workload and has the ongoing benefit of reducing that of colleagues who have taken over teaching the module.

Objectives

  • Record short video lectures to supplement live lectures.
  • Use the time freed up by the removal of content no longer delivered live to introduce active learning techniques within the lectures.
  • Support the students in their problem-solving skills (tested in the end of year examination).

Context

The module “General Introduction to Law” is a “lecture only” first year undergraduate module, which is mandatory for many non-law students, covering unfamiliar legal concepts. Whilst I have previously tried to introduce some active learning into these lectures, I have struggled with time constraints due to the sheer volume of compulsory material to be covered.

Student feedback requested more support in tackling legal problem questions. I wanted to assist students and needed to free up some space within the lectures to do so; “flipping” some of the content by creating videos seemed to offer a solution.

As many academics (Berrett, 2012; Schaffzin, 2016) have noted, there is more to flipping than merely moving lectures online, it is about a change of pedagogical approach.

Implementation

I sought initial support from the TEL (Technology Enhanced Learning) team, who were very happy to give advice about technology options. I selected the free Screencast-O-Matic software, which was simple to use with minimal equipment (a headset with microphone plugged into my computer).

I recorded 8 short videos, which were screencasts of some of my lecture slides with my narration; 6 covered traditional lecture content and 2 offered problem-solving advice, modelling an exemplar problem question and answer (which I had previously offered as straightforward read-only documents on Blackboard).

The software that I used restricted me to 15-minute videos, which worked well for maintaining student attention. My screencast videos were embedded within the Blackboard module and could also be viewed directly on the internet: https://screencast-o-matic.com/u/iIMC/AmandaMillmoreGeneralIntroductiontoLaw.

I reminded students to watch the videos via email and during the lectures, and I was able to track the number of views of each video, which enabled me to prompt students if levels of viewing were lower than I expected.

By moving some of the content delivery online I was also able to incorporate more problem-solving tasks into the live lectures. I was able to slow the pace and to invite dialogue, often by using technology enhanced learning. For example, I devoted an hour to tackling an exam-style problem, with students actively working to solve the problem using the knowledge gained via the flipped learning videos and previous live lectures. I used the applications Mentimeter, Socrative and Kahoot to interact with the students, asking them multiple-choice questions, encouraging them to vote on questions and to create word clouds of their initial thoughts on tackling problem questions as we progressed.

Evaluation

I evaluated reaction to the module using the usual formal and informal module evaluations. I also tracked engagement with the videos and actively used these figures to prompt students if views were lower than expected. I monitored attendance at lectures and didn’t notice any drop-off. Finally, I reviewed end-of-year results to assess the impact on students’ results.

Impact

Student feedback, about the videos and problem solving, was overwhelmingly positive in both formal and informal module evaluations.

Videos can be of assistance if a student is absent, has a disability or wishes to revisit the material. Sankoff (2014) and Billings-Gagliardi and Mazor (2007) dismiss concerns about reduced student attendance due to online material, and this was borne out by my experience, with no noticeable drop-off in numbers attending lectures; I interpret this as a positive sign of student satisfaction. The videos worked to supplement the live lectures rather than replace them.

There is a clear, positive impact on my own workload and that of my colleagues. Whilst I am no longer teaching on this module, my successor has been able to use my videos again in her teaching, thereby reducing her own workload. I have also been able to re-use some of the videos in other modules.

Reflections

Whilst flipped learning is intensive to plan, create and execute, the ability to re-use the videos in multiple modules is a huge advantage; short videos are simple to re-record if, and when, updating is required.

My initial concern that students would not watch the videos was utterly misplaced. Each video has had in excess of 1200 views (and one video has exceeded 2500). Some of the material was covered only by the flipped-learning videos, yet still appeared within the examination; students who tackled those questions did as well as those answering questions on content delivered via live lecture, although those questions were less popular (2017/18 examination).

I was conscious that there might be some students who would simply ignore the videos, thereby missing out on chunks of the syllabus. I tried to mitigate this by running quizzes during lectures on the recorded material, and by offering banks of multiple-choice questions (MCQs) on Blackboard for students to test their knowledge (aligned to the summative examination, which included a multiple-choice section). In addition, I clearly signposted the importance of the video-recorded material by email, on the Blackboard page and orally, emphasising that it would form part of the final examination and could not be ignored.

My experience echoes that of Schaffzin’s study (2016) monitoring impact, which showed no statistically significant difference in law results after instituting flipped learning, although she felt that it was a more positive teaching method. Examination results for the module in the end-of-year summative assessment (100% examination) were broadly consistent with the results in previous academic years, but student satisfaction was higher, with positive feedback about the use of videos and active learning activities.

Follow Up

Since I created the flipped-learning videos, another colleague has taken over as convenor and continued to use them; some have also been reused in other modules. I have used screencast videos in another non-law module, and also used them as introductory material for a large core Part 1 Law module. Student feedback in module evaluations praised the additional material. One evolution in another module was that, when I ran out of time to work through a past exam question within a lecture, I created a quick screencast which finished off the topic for students; I felt it was better to go at a more sensible pace in the lecture and use the screencast rather than rush through the material.

Michelle Johnson, Module Convenor 2018-2019 commented that:

“I have continued to use and expand the flipped learning initiative as part of the module and have incorporated further screencasts into the module in relation to the contract law content delivered. This allowed for additional time on the module to conduct a peer-assessment exercise focussed on increasing the students’ direct familiarity with exam questions and also crucially the marking criteria that would be used to score their Summer exams. Students continue to be very positive about the incorporation of flipped learning material on the module and I feel strongly that it allowed the students to review the more basic introductory content prior to lectures, this allowing time for a deeper engagement with the more challenging aspects of the lectures during lecture time. This seemed to improve students understanding of the topics more broadly, allowing them to revisit material whenever they needed and in a more targeted way than a simple lecture recording.”

TEF

TQ1, LE1, SO3

Links

University of Reading TEL advice about personal capture – https://sites.reading.ac.uk/tel-support/category/learning-capture/personal-capture

Berrett, D. (2012) How ‘Flipping’ the Classroom Can Improve the Traditional Lecture. Chronicle of Higher Education. https://www.chronicle.com/article/how-flipping-the-classroom/130857

Billings-Gagliardi, S. and Mazor, K. (2007) Student decisions about lecture attendance: do electronic course materials matter? Academic Medicine: Journal of the Association of American Medical Colleges, 82(10), S73-S76.

Sankoff, P. (2014) Taking the Instruction of Law outside the Lecture Hall: How the Flipped Classroom Can Make Learning More Productive and Enjoyable (for Professors and Students). Alberta Law Review, 51, pp. 891-906.

Schaffzin, K. (2016) Learning Outcomes in a Flipped Classroom: A Comparison of Civil Procedure II Test Scores between Students in a Traditional Class and a Flipped Class. University of Memphis Law Review, 46, p. 661.

Making full use of GradeMark in Geography and Environmental Science – Professor Andrew Wade

 


Professor Andrew Wade is responsible for research in hydrology, focused on water pollution, and for undergraduate and postgraduate teaching, including Hydrological Processes.

OBJECTIVES

Colleagues within the School of Archaeology, Geography and Environmental Sciences (SAGES) have been aware of the University’s broader ambition to move towards online submission, feedback and grading where possible. Many had already made the change from paper-based to online practices, and others felt that they would like the opportunity to explore new ways of providing marks and feedback, to see if handling the process online led to a better experience for both staff and students.

CONTEXT

In Summer 2017 it was agreed that SAGES would become one of the Early Adopter Schools working with the EMA Programme. This meant that the e-Submission, Feedback and Grading workstream within the Programme worked very closely with both academic and professional colleagues within the School from June 2017 onwards, in order to support all aspects of a change from offline to online marking, and the broader processes, for all coursework except where there was a clear practical reason not to, for example field notebooks.

I had started marking online in 2016/17, so I was familiar with some aspects of the marking tools and some of the broader processes.

IMPLEMENTATION

My Part 2 module, GV2HY Hydrological Processes, involves students producing a report containing two sections: Part A focuses on a series of short answers based on practical-class experiences, and Part B requires students to write a short essay. I was keen to use all of the functionality of GradeMark/Turnitin during the marking process, so I spent time creating my own personalised QuickMark bank so that I could simply pull across commonly used feedback phrases and marks for each specific question. This function was particularly useful when marking Part A: I could pull across QuickMarks showing the mark and then, in the same comment, explain why the question received, for example, 2 out of a possible 4 marks. It was especially helpful that my School sent around a discipline-specific set of QuickMarks created by a colleague. We could then pull the whole set, or just particular QuickMarks, into our own personalised set if we wanted to. This reduced the time spent on personalising and meant that the quality of my own set was improved further.

I also wanted to explore the usefulness of rubric grids as one way to provide feedback on the essay content in Part B of the assignment. A discipline-specific example rubric grid was created by the School and sent around to colleagues as a starting point. We could then amend this rubric to fit our specific assessment or, more generally, our modules and programmes. The personalised rubrics were attached to assignments using a simple process led by administrative colleagues. When marking, I would indicate the level of performance achieved by each student against each criterion by simply highlighting the relevant box in blue. This rubric grid was used alongside both QuickMarks and in-text comments in the essay. More specific comments were given in the blank free-text box to the right of the screen.

IMPACT

Unfortunately, module evaluation questionnaires were distributed and completed before students received feedback on their assignments, so the student reaction to online feedback using QuickMarks, in-text comments, free-text comments and rubrics was not captured.

In terms of the impact on the marker experience, after spending some initial time getting my personal QuickMarks library right and amending the example rubric to fit my module, I found marking online easier and quicker than marking on paper.

In addition to this, I also found that the use of rubrics helped to ensure standardisation. I felt comfortable that my students were receiving similar amounts of feedback and that this feedback was consistent across the cohort, including when returning to marking the coursework after a break. When moderating coursework, I tend to find more consistent marking where colleagues have used a rubric.

I also felt that students received more feedback than they usually might, but am conscious of the risk that they drown in the detail. I try to use the free-text boxes to provide a useful overall summary and to avoid overuse of QuickMarks.

I don’t worry now about carrying large amounts of paper around or securing the work when I take assignments home. I also don’t need to worry about whether the work I’m marking has been submitted after the deadline – under the new processes established in SAGES, Support Centre colleagues deduct marks for late submission.

I do tend to provide my cohorts with a short piece of generic feedback, including an indicator of how the group performed, showing the percentage of students who attained a mark in each class. I could easily access this information from GradeMark/Turnitin.

I’m also still able to work through the feedback received by my Personal Tutees. I arrange individual sessions with them, they access ‘My Grades’ on Blackboard during this meeting and we work through the feedback together.

One issue was that, because the settings were configured in a particular way, students could access their feedback as soon as we had finished writing it. This issue was identified quickly and the settings were changed.

REFLECTIONS

My use of online marking has been successful and straightforward, but my experience has been helped very significantly by the availability of two screens in my office. These had already been provided by the School but became absolutely essential. Although I largely mark in my office on campus, when I mark from home I set up two laptops next to each other to replicate having two screens. This set-up allows me to keep the student’s coursework on one screen whilst working with the marking tools on the other.

One further area of note is that the process of actually creating a rubric prompted a degree of reflection on what we actually want to see from students against each criterion and at different levels. This was particularly true around the grade classification boundaries: what is the difference between a high 2:2 and a low 2:1 in terms of each of the criteria we mark against, and how can we describe these differences in the descriptor boxes of a rubric grid so that students can understand them?

This process of trying to make full use of all of the functions within our marking tools has led to some reflection surrounding criteria, what we want to see and how we might describe this to students.

LINKS

For more information on the creation and use of rubrics within GradeMark/Turnitin, please see the Technology Enhanced Learning blog pages here:
http://blogs.reading.ac.uk/tel/support-blackboard/blackboard-support-staff-assessment/blackboard-support-staff-turnitin/turnitin-rubrics/

Introducing online assessment in IFP modules – Dr Dawn Clarke

OBJECTIVES

Colleagues within the IFP wanted to improve the student assessment experience. In particular, we wanted to make the end-to-end process quicker and easier, and to reduce printing costs for students. We also wanted to offer some consistency with undergraduate programmes. This was particularly important for those students who stay in Reading after their foundation year to undertake an undergraduate degree. We were also keen to discover whether there would be any additional benefits or challenges which we had not anticipated.

CONTEXT

No IFP modules had adopted online submission, grading and feedback until Spring 2015. We were aware of a number of departments successfully running online assessment within the University, and of the broader move towards electronic management of assessment within the sector as a whole. We introduced online assessment for all written assignments, including work containing pictures and diagrams, on the IFP modules ‘Politics’ (PO0POL) and ‘Sociology’ (PO0SOC) in 2015.

IMPLEMENTATION

We made the decision very early in the process that we would use Turnitin GradeMark within the Blackboard Grade Center. This was consistent with existing use in the Department of Politics.
We created a set of bespoke instructions for students to follow when submitting their work and when viewing their feedback. These instructions were based on those provided by the Technology Enhanced Learning Team but adjusted to fit our specific audience. They were distributed in hard copy, and we spent some time in class reviewing the process well before the first submission date.

Submission areas in Blackboard and standard feedback rubric sections were created by the Departmental Administrator who was already highly experienced.

IMPACT

Overall, the end-to-end assessment process did become easier for students. They didn’t have to travel to campus to submit their assignments, and they enjoyed instant access to Turnitin.
Turnitin itself became a very useful learning tool for pre-degree foundation students. It not only provided initial feedback on their work but prompted a dialogue with the marker before work was finally submitted. For students right at the start of their university experience this was extremely useful.

It was equally useful to automate deadlines. Students very clearly understood the exact time of the deadline, and the marker was external to this process, allowing them to adopt a more neutral position. This was more transparent than manual systems and ensured a visibly consistent experience for all students.

In addition to this, because students did not have to print out their assignments, they became much more likely to include pictures and diagrams to illustrate their work. This often improved the quality of submission.

All students uploaded their essays without any additional help. A small number also wanted to upload the PowerPoint slides from their in-class presentations at the same time, which meant that we needed to work through the difficulty of uploading two files under one submission point.

Moving to online assessment presented a number of further challenges. In particular, we became aware that not all students were accessing their feedback. Arranging online access for external examiners in order to moderate the work presented a final challenge. We then worked to address both of these issues.

REFLECTIONS

It would be really helpful to explore the student experience in more depth. One way to do this would be to include a section specifically focused on feedback within IFP module evaluation forms.
In the future we would like to make use of the audio feedback tool within the Grade Center. This will maximise the experience of international students and their chances of developing language skills.

Using QuickMarks to enhance essay feedback in the Department of English Literature – Dr Mary Morrissey

Within the department, I teach primarily in Early Modern and Old English. For more details of my teaching, please see Mary Morrissey Teaching and Convening.

My primary research subject is Reformation literature, particularly from London. I am particularly interested in Paul’s Cross, the most important public pulpit in sixteenth- and seventeenth-century England. I retain an interest in early modern women writers, with a particular focus on their use of theological arguments. Further details of my research activities can be found at Mary Morrissey Research.

OBJECTIVES

A number of modules within the Department of English Literature began using GradeMark as a new marking tool in the autumn of 2015. I wanted to explore the use of the new QuickMarks function as a way of enhancing the quality of the feedback provided to our students and ensuring that the ‘feedback loop’, from general advice on essay writing to feedback on particular pieces of assessed work, was completed.

CONTEXT

The Department developed extensive guidance on writing skills to support student assessment: this includes advice on structuring an argument as well as guidance on grammar and citations. This guide was housed in departmental handbooks and in the assignments folder in Blackboard. There was considerable concern that this resource was underused by students. We did know that the QuickMarks function was being used as part of our online feedback provision, and that it was possible to personalise the comments we were using and to add links to those comments as a way of providing additional explanation to students.

IMPLEMENTATION

In order to allow relevant sections of the essay-writing style guide to be accessed via QuickMarks, I copied the document into a Google Doc, divided each section using Google Doc bookmarks and assigned each bookmark an individual URL. I then used Bitly.com to shorten the URL assigned to each section to make it more usable, and created a set of QuickMarks that included these links to the style guide. In this way, students had direct access to the relevant section of the guide while reading their feedback. So if a student had not adopted the correct referencing format (the Modern Humanities Research Association style in the case of English Literature), the marker would pull a QuickMark across to the relevant point of the essay. When the student hovered over this comment bubble, they would see the text within it and could also click on the URL, taking them directly to page 7 of the departmental writing style guide on MHRA citation and referencing. When other colleagues wanted to adopt the same approach, I simply exported the QuickMark set to them, and they incorporated it into their own QuickMarks bank within seconds.
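For anyone recreating this link set today, a minimal sketch of the shortening step using Bitly’s v4 REST API is given below. The Google Doc URL, bookmark IDs and access token are placeholders, and the original workflow used the Bitly website rather than the API:

    # Minimal sketch: create one short link per style-guide section so each
    # can be pasted into a QuickMark comment. The document URL, bookmark
    # fragments and access token below are placeholders, not the real ones.
    import requests

    TOKEN = "YOUR_BITLY_ACCESS_TOKEN"  # placeholder
    GUIDE = "https://docs.google.com/document/d/EXAMPLE_DOC_ID/edit"  # placeholder

    sections = {
        "MHRA citation": GUIDE + "#bookmark=id.mhra",  # hypothetical bookmark IDs
        "Structuring an argument": GUIDE + "#bookmark=id.structure",
    }

    for name, long_url in sections.items():
        resp = requests.post(
            "https://api-ssl.bitly.com/v4/shorten",
            headers={"Authorization": "Bearer " + TOKEN},
            json={"long_url": long_url},
        )
        resp.raise_for_status()
        print(name, "->", resp.json()["link"])  # short URL for the QuickMark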

IMPACT

The Bitly.com tool, used to shorten the URLs, also monitored the usage of each link included in our QuickMarks. This showed us how many times, and on which dates, each individual link was used.
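Those usage counts can also be retrieved programmatically; the sketch below uses Bitly’s v4 click-summary endpoint, again with a placeholder token and bitlink (the figures reported in this study were read from the Bitly dashboard, not retrieved this way):

    # Minimal sketch: total clicks for one shortened link (placeholder ID).
    import requests

    TOKEN = "YOUR_BITLY_ACCESS_TOKEN"  # placeholder
    bitlink = "bit.ly/exampleMHRA"     # placeholder

    resp = requests.get(
        "https://api-ssl.bitly.com/v4/bitlinks/" + bitlink + "/clicks/summary",
        headers={"Authorization": "Bearer " + TOKEN},
        params={"unit": "month", "units": -1},  # -1 = all available history
    )
    resp.raise_for_status()
    print("Total clicks:", resp.json()["total_clicks"])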

To complement this data, I also ran a survey on the student response to online marking and feedback; 35 undergraduate students responded. This showed that students found feedback most useful when it came in forms that were familiar from paper marking, such as general comments on the essay and marginal comments throughout it. Less familiar types of feedback (links to web resources included in bubble comments accessed by hovering the cursor) were often missed. In the survey, 28 out of 35 students said that they did not receive any links to the writing style guide within their QuickMark comments, even though more than this number did in fact receive them. 3 students did not click on the links. Of the 5 remaining students who did make use of the links, 3 responded positively, mentioning their value in terms of improving their writing skills:

“It was good to refer to alongside my work”
“They helped me to strengthen my writing overall”
“Yes motivational to actually look at them-whereas on a paper copy you might read the comment and forget but here you can click straight through so much easier!”

REFLECTIONS

Some of the new functions available to us in GradeMark allow us to improve our feedback. We shouldn’t just be using online marking tools to replicate existing offline marking processes. We can go much further! But if this is going to be successful, it is really important to inform students about the range of options that online marking makes available, so that they make the most of the systems we use.

Once we do this effectively, we can then explore other options. In English Literature, we are keen to ensure that our departmental style guide is used effectively, but there are many other web resources to which we could link through QuickMarks: screencast essay-writing guides in Politics and IWLP, as well as the new Academic Integrity Toolkit by Study Advice, for example.

By including links within QuickMark comments we help to move students towards greater levels of assessment literacy.

LINKS

Academic Integrity Toolkit

http://libguides.reading.ac.uk/academicintegrity

Examples of assessment support screencasts created by colleagues

Screencast bank

Study Support Screencast Suite

https://www.reading.ac.uk/library/study-advice/guides/lib-sa-videos.aspx

Bitly URL shortener and link management platform

https://bitly.com/