Adrian Aronsson-Storrier – School of Law
Watch Adrian’s 5 minute video case study about how he used personal capture (the Mediasite tool) to record a lecture cancelled due to bad weather during the personal capture pilot project, 2018-19.
Sue Blackett – Henley Business School, 2018-19
I participated in the university’s Personal Capture pilot as a Champion for my school, trialling the Mediasite tool to create screencast videos for use in teaching and learning. My aim was to help PGT students get to grips with key elements of the module. The videos allowed students to view content repeatedly, with the aim of increasing engagement with the module. Some videos were watched multiple times at different points throughout the term, indicating that students needed to refresh the information.
The target audience was students on ACM003 Management Accounting Theory & Practice, a postgraduate course where 91% of students were native Mandarin speakers. English language skills were an issue for some students, so the captured videos gave students opportunities to re-watch the content and get to grips with it at their leisure. In addition, I wanted to free up class contact time so I could focus on areas that had been more challenging on the previous run of the module. Finally, using different colours and font sizes on the PowerPoint slides meant that the visual emphasis of key points reinforced the accompanying audio.
The first video recorded was a welcome-to-the-module video (slides and audio only) that covered the module administration: an overview of the module, an outline of the assessment, key dates, the module textbook, etc. The content for the video was relatively straightforward as it was taken from the first lecture’s slides. By isolating the module admin information, more detail could be added, e.g. mapping assessable learning outcomes to assessments and explaining the purpose of each type of assessment. When recording the video, I did not follow a script, as I was trying to make my delivery sound more natural. Instead, I made short notes on slides that needed extra information and printed off the presentation as slides with notes. As this is the same strategy that I use to deliver lectures, I was less concerned about being “audio ready”, i.e. not making errors in my voice recording.
In the second and third videos (coursework feedback and exam revision advice), I included video of myself delivering the presentations. As the recordings were made in my home office, additional visual matters had to be considered: what I was wearing, the background behind me, looking into the camera, turning pages, etc. The second attempt at each recording was much more fluent, so these versions were uploaded to Blackboard.
The last two recordings were quite different in nature. The coursework feedback video used bar charts and tables to communicate statistics, accompanied by audio that focused on qualitative feedback. The exam revision video used many narrative bullet points.
Examples of my videos:
Welcome to module: https://uor.mediasite.com/Mediasite/Play/7a7f676595c84507aa31aafe994f2f071d
Assessed coursework feedback: https://uor.mediasite.com/Mediasite/Play/077e974725f44cc8b0debd6361aaaba71d
Exam revision advice: https://uor.mediasite.com/Mediasite/Play/94e4156753c848dbafc3b5e75a9c3d441d
Resit Exam Advice: https://uor.mediasite.com/Mediasite/Play/e8b88b44a7724c5aa4ef8def412c22fd1d
The welcome video did have impact, as it was the only source of information about the administration of the course. When students arrived at the first class with the textbook, this indicated that they had been able to access the information they needed to prepare for the course. The student response rate to the personal capture pilot project questionnaire was low (18%); however, the general feedback was that the videos were useful in supporting students during the course.
Analysis of the analytics available via Mediasite and Blackboard provided some very interesting insights:
Reviewing the videos alongside the watching trends showed that students skipped through to the points where slides changed. This suggested that the majority were reading the slides rather than listening to the accompanying commentary, which contained supplementary information.
As no student failed to meet the admin expectations of the course, those who had not watched the video had presumably been informed by those who had.
The analytics were most illuminating. My appearing in the videos was intended to establish bonds with the cohort and increase engagement; however, my appearance seemed to be irrelevant, as the students were focused on reading rather than listening. This could have been due to weaker listening skills, but it also highlights that students might assume that all important information is written down rather than spoken.
Videos with graphics were watched more than those without, so my challenge will be to think about what content I include in slides, i.e. more graphics with fewer words and/or narrative slides with no audio.
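The skip-to-slide-change pattern described above can be estimated from per-viewer watch segments. The sketch below assumes a hypothetical export format (a list of watched time intervals); Mediasite’s actual analytics exports will differ:

```python
# Sketch: estimate how intensively each slide's interval was watched, given
# per-viewer watch segments. The (start, end) segment format is an assumption
# for illustration, not Mediasite's real export schema.

def coverage_by_slide(slide_starts, video_length, segments):
    """slide_starts: sorted slide-change times in seconds (first is 0).
    segments: (start, end) watched intervals pooled across all viewers.
    Returns average number of views of each slide's interval."""
    bounds = slide_starts + [video_length]
    coverage = []
    for i in range(len(slide_starts)):
        lo, hi = bounds[i], bounds[i + 1]
        # total seconds of overlap between watch segments and this interval
        watched = sum(max(0, min(end, hi) - max(start, lo))
                      for start, end in segments)
        coverage.append(watched / (hi - lo))
    return coverage

# Example: three slides in a 300-second video; viewing clusters at slide changes
slides = [0, 100, 200]
segs = [(0, 15), (100, 110), (200, 230), (0, 20), (95, 130)]
print(coverage_by_slide(slides, 300, segs))  # → [0.4, 0.4, 0.3]
```

Low, roughly uniform coverage across intervals, with spikes just after each slide change, is consistent with students reading slides rather than listening through the commentary.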
I will continue with capture videos; however, I will do more to test their effectiveness, for example by designing in-class quizzes using Kahoot, Mentimeter, etc. to test whether the content of the videos has been internalised.
I’ve become much quicker at designing the PowerPoint content and less worried about stumbling or searching for the right words to use. I have been able to edit videos more quickly, e.g. cutting out excess time and cropping the end of the video. Embedding videos in Blackboard has also become easier the more I’ve done it. The support information was good; however, I faced a multitude of problems that IT Support had to help me with, which, if I’m honest, was putting me off using the tool (I’m a Mac user mostly using this tool off campus).
Dr Madeleine Davies and Michael Lyons, School of Literature and Languages
The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic. Click here for the first entry, which outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students connect with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.
Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but this time we gave students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.
The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.
The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.
In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.
If, however, students overwhelmingly choose non-exam-based modules, this would leave modules retaining an exam in a vulnerable position. The aim of this project is to find ways to diversify our assessments, but this could leave modules that retain traditional assessment patterns vulnerable to students deselecting them. This may have implications for benchmarking.
It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.
With thanks to Lauren McCann of TEL for sending me the first link which includes a summary of students’ responses to various types of ‘new’ assessment formats.
https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/
The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature).
Rachel Warner, School of Arts and Communication Design
Rachel.Warner@pgr.reading.ac.uk
Jacqueline Fairbairn, Centre for Quality Support and Development
Rachel in Typography and Graphic Communication (T&GC) worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow, to improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.
This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.
The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.
Rachel wanted to review the process for submitting video work for e-assessment, and find ways to streamline the time-intensive marking process, particularly in accessing and reviewing video files, without compromising good assessment practice. This is also acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark, so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”
The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving students’ assessment literacy was also a key driver, to reduce the number of queries and technical issues when working with video technologies.
Rachel consulted TEL to discuss:
To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Using OneDrive encouraged professional behaviours, such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised, industry-standard platform.
To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.
Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.
The most effective intervention was the introduction of a formative support session (1.5 hours). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop helped students to understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, toolkit and digital coversheet, and allowed students to explore the technologies in a supported manner, improving their assessment literacy.
The assignment instructions were strategically revised to include ‘hints and tips’ supporting the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).
Students were given the option to self-select recording/editing software for the screencast video. Free-to-use tools that students could explore were recommended: ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.
Marking the submissions was made easier by being able to access videos in a consistent format through a clearly structured submission process (the digital coversheet). Being able to play URL links directly through OneDrive meant Rachel could store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting key timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).
Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly sped up the marking workflow and allowed Rachel to identify patterns in students’ work where common feedback statements could be applied, as appropriate.
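The cross-referencing idea behind the spreadsheet can be sketched in a few lines: map each rubric criterion and band to a reusable feedback statement, then assemble per-student feedback from the marks. The criterion names, bands and statements below are invented for illustration, not Rachel’s actual rubric:

```python
# Sketch of a feedback bank keyed by (criterion, band). All names and
# statements here are hypothetical placeholders, not the real rubric.

FEEDBACK_BANK = {
    ("clarity", "good"): "Narration is clear and well paced.",
    ("clarity", "develop"): "Slow down and signpost each step of the demonstration.",
    ("technique", "good"): "Confident, accurate use of the software tools.",
    ("technique", "develop"): "Practise the core tool operations before recording.",
}

def assemble_feedback(marks):
    """marks: dict mapping criterion -> band, e.g. {'clarity': 'good', ...}.
    Returns the common feedback statements joined into one comment."""
    return " ".join(FEEDBACK_BANK[(criterion, band)]
                    for criterion, band in marks.items())

print(assemble_feedback({"clarity": "good", "technique": "develop"}))
```

The marker still adds individual comments on top; the bank only covers the recurring patterns, which is where the time saving comes from.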
The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before, and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.
The modifications to the assessment:
Future modifications to the screencast submission:
Toolkit basics:
Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted: ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and learn from ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.
Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a technique in Photoshop that requires multiple elements to achieve results. Its conclusion demonstrates the student’s awareness that the technique is useful in scenarios other than the one demonstrated, encouraging the listener to continue learning. The student has used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.
Screencast example: (YouTube link) This demonstrates a student who is competent in a tool, able to use their own work (work from another module on the course) to demonstrate a task, and additionally includes their research into how the tool can be used for other tasks.
Other screencast activity from the Typography & Graphic Communication department from the GRASS project: (Blog post) Previous project for Part 1s that included use of screencasts to demonstrate students’ achievements of learning outcomes.
Dr Madeleine Davies, Department of English Literature
m.k.davies@reading.ac.uk
In 2017 I replaced the exam on a Part 3 module I convene (‘Margaret Atwood’) with an online learning journal assessment and I was so impressed with the students’ work that I sought funding to publish selected extracts in a UoR book, Second Sight: The Margaret Atwood Learning Journals. The project has involved collaboration between the Department of English Literature and the Department of Typography & Graphic Communication, and it has confirmed the value of staff-student partnerships, particularly in relation to celebrating student attainment and enhancing graduate employability.
The ‘Margaret Atwood’ module has always been assessed through an exam and a summative essay but I was dissatisfied with the work the exam produced (I knew that my students could perform better) so I researched alternative assessment formats. In 2017 I replaced the exam with a Blackboard learning journal because my research suggested that it offered the potential to release students’ creative criticality. I preserved the other half of the assessment model, the formal summative essay, because the module also needed an assessment where polished critical reading would be rewarded. With both assessment elements in place, students would need to demonstrate flexible writing skills and adapt to different writing environments (essential graduate skills). A manifest benefit of journal assessment is that it offers students to whom essay-writing does not come easily an opportunity to demonstrate their true ability and engagement so the decision to diversify assessment connected with inclusive practice.
I decided to publish the students’ writing in a UoR book because I did not want to lose their hard work to a digital black-hole: it deserved a wider audience. I sought funding from our Teaching and Learning Deans, who supported the project from the beginning, and I connected with the ‘Real Jobs’ scheme in the Department of Typography & Graphic Communication where students gain valuable professional experience by managing funded publishing commissions for university staff and external clients. This put me in contact with a highly skilled student typographer with an exceptional eye for design. I asked a member of the ‘Margaret Atwood’ group to help me edit the book because I knew that she wanted to pursue a career in publishing and this project would provide invaluable material for her CV. Together we produced a ‘permissions’ form for students to formally indicate that they were releasing their work to the publication and 27 out of 36 students who were enrolled on the Spring Term module responded; all warmly welcomed the initiative. Contributors were asked to submit Word files containing their entries so as to preserve the confidentiality of their online submissions; this was important because the editors and designers were fellow students. Throughout the Summer Term 2018, the students and I met and planned, designed and edited, and the result is a book of which we are proud. With the sole exception of the Introduction which I wrote, every element of it, from the cover image to the design to the contents, is the work of our students.
The impact of the project will be registered in terms of Open Days because Second Sight will help demonstrate the range of staff-student academic and employability activities in DEL. In addition, the project has consolidated connections between DEL and the Department of Typography & Graphic Communication and we will build on this relationship in the next session.
A further impact, which cannot be evidenced easily, is that it provides a useful resource for our graduates’ job applications and interviews: students entering publishing or journalism, for example, will be able to speak to their participation in the project and to their work in the book. The collection showcases some excellent writing and artwork and DEL graduates can attend interviews with tangible evidence of their achievements and abilities.
Producing this book with such talented editors, designers and contributors has been a joy: like the ‘Margaret Atwood’ module itself, Second Sight confirms the pleasures and the rewards of working in partnership with our students.
The project sharpened my own editing skills and created a space to share knowledge about publishing conventions with the students who were assisting me. We all learned a great deal from each other: June Lin, the Typography student designer, gave me and the student editor (Bethany Barnett-Sanders) insights into the techniques of type-setting and page layout. To reciprocate, Bethany and I enhanced June’s knowledge of Margaret Atwood’s work which she had read but never studied. This pooling of knowledge worked to the benefit of us all.
One of the advantages of the learning journal was that it allowed me a clear view of the inventiveness and ingenuity that students bring to their work, and my sense of appreciation for their skill was further enhanced by working with students on the book. Technically, this was less of a ‘staff-student’ collaboration than it was a mutual education between several people.
The process we followed for acquiring written permission from students to include their work in the book, and for gathering Word files to avoid confidentiality issues, was smooth, quick, and could not have been improved. The only difficulty was finding time to edit seventy-five contributions to the book in an already busy term. Whilst this was not easy, the results of the collaboration have made it well and truly worth it.
It is too early to tell whether other DEL colleagues will choose to diversify their own assessments and, if they do, whether they will pursue a publishing project similar to the ‘Margaret Atwood’ example. There is, however, a growing need for Open Day materials, and Second Sight joins the Department’s Creative Writing Anthology in demonstrating that academic modules contain within them the potential for publication and collaborative initiative. I will certainly be looking to produce more publications of this nature on my other learning journal modules in the next session; in the meantime, copies of Second Sight will be taken with me to the outreach events I’m attending in July in order to demonstrate our commitment to student engagement, experience and employability here at the University of Reading.
Dr Maria Kambouri-Danos, Institute of Education
m.kambouridanos@reading.ac.uk
Year of activity 2016/17
This entry aims to share the experience of re-designing and evaluating assessment in collaboration with students. It explains the need for developing the new assessment design and then discusses the process of implementing and evaluating its appropriateness. It finally reflects on the impact of MCQ tests, when assessing students in higher education (HE), and the importance of engaging students as partners in the development of new assessment tools.
Child Development (ED3FCD) is the core module for the BA in Children’s Development and Learning (BACDL), meaning that a pass grade must be achieved on the first submission to gain a BA Honours degree classification (failing leads to an ordinary degree). The assessment needed to be redesigned as it placed the total weight of students’ marks on a single essay. As the programme director, I wanted to engage the students in the re-design process and evaluate the impact of the new design on both students and staff.
After attending a session on ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, I decided to explore further the idea of using multiple choice question (MCQ) tests. To do so, I attended a session on ‘Team Based Learning (TBL)’ and another on ‘MCQ: More than just a Test of Information Recall’ to gather targeted knowledge about designing effective MCQ questions.
I realised that MCQ tests can help assess students’ understanding and knowledge and also stimulate students’ active and self-managed learning. Guided by the idea of ‘assessment for learning’, I proposed the use of an MCQ test during a steering group meeting (employees and alumni) and a Board of Studies (BoS) meeting, which second-year Foundation Degree and BACDL student representatives attended. The idea met initial resistance, as MCQ tests are not traditionally used in HE education departments. However, after different options had been explored and the advantages of MCQ tests highlighted, the agreement was unanimous. At the last BoS meeting (2016), students and staff finalised the proposal for the new design, proposing to use the MCQ test for 20% of the overall mark and keeping the essay for the remaining 80%.
At the beginning of 2017, I invited all BACDL students to anonymously post their thoughts and concerns about the new design (and the MCQ test) on Padlet. Based on these comments, I then worked closely with the programme’s student representatives and had regular meetings to discuss, plan and finalise the assessment design. We decided how to calculate the final mark (as the test was completed individually and then in a group) as well as the total number of questions, the duration of the test, etc. A pilot study was then conducted during which a sample MCQ test was shared with all the students, asking them to practise and then provide feedback. This helped to decide the style of the questions used for the final test, an example of which is given below:
There are now more than one million learners in UK schools who speak English as an additional language (EAL). This represents a considerable proportion of the school population, well above 15 per cent. To help EAL children develop their English, teachers should do all the following, except…
a. use more pictures and photographs to help children make sense of new information.
b. use drama and role play to make learning memorable and encourage empathy.
c. maintain and develop the child’s first language alongside improving their English.
d. get children to work individually because getting them into groups will confuse them and make them feel bad for not understanding.
e. provide opportunities to talk before writing and use drills to help children memorise new language.
Students were highly engaged in the process of developing the new design, and the staff-student collaboration encouraged the development of bonds within the group. The students were excited by the opportunity to actively develop their own course, and the experience empowered them to take ownership of their own learning. All of them agreed that they felt important and, as a student representative said, that “their voices were heard”.
The new design encouraged students to take the time to gauge what they already know and identify their strengths and weaknesses. Students themselves noted that the MCQ test helped them to develop their learning as it was an additional study opportunity. One of them commented that “…writing notes was a good preparation for the exam. The examination was a good learning experience.” Staff also agreed that the test enabled students to (re)evaluate their own performance and enhance their learning. One of the team members noted that the “…test was highly appropriate for the module as it offered an opportunity for students to demonstrate their proficiency against all of the learning outcomes”.
The new assessment design was implemented successfully because listening to the students’ voice and responding to their feedback was an essential part of the designing process. Providing opportunities to both students and staff to offer their views and opinions and clearly recognising and responding to their needs were essential, as these measures empowered them and helped them to take ownership of their learning.
The BACDL experience suggests that MCQ tests can be adapted and used for different subject areas as well as to measure a great variety of educational objectives. Their flexibility means that they can be used for different levels of study or learning outcomes, from simple recall of knowledge to more complex levels, such as the student’s ability to analyse phenomena or apply principles to new situations.
However, good MCQ tests take time to develop. It is hoped that next year the process of developing the test will be less time-consuming, as we already have a bank of questions that we can use. This will enable randomisation of questions, which will also help to avoid misconduct. We are also investigating options that would allow the test to be administered online, meaning that feedback could be offered immediately, further reducing the time and effort required to mark the test.
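The randomisation idea is straightforward once a question bank exists: sample a subset of questions per paper and shuffle each question’s options, so no two students see an identical test. A minimal sketch, with an invented bank structure (any real MCQ platform will have its own format):

```python
import random

# Sketch: draw a randomised MCQ paper from a question bank. The dict layout
# of the bank is an illustrative assumption, not a real system's schema.

def build_test(bank, n_questions, seed=None):
    """Return a paper of n_questions drawn without repeats from the bank,
    with each question's options shuffled."""
    rng = random.Random(seed)           # seed allows a reproducible paper
    questions = rng.sample(bank, n_questions)
    paper = []
    for q in questions:
        options = q["options"][:]       # copy so the bank stays untouched
        rng.shuffle(options)
        paper.append({"stem": q["stem"],
                      "options": options,
                      "answer": q["answer"]})
    return paper

# Example: a toy bank of 20 questions, 5 drawn per paper
bank = [
    {"stem": f"Question {i}?", "options": ["a", "b", "c", "d"], "answer": "a"}
    for i in range(20)
]
paper = build_test(bank, n_questions=5, seed=42)
print(len(paper))  # → 5
```

Online testing platforms typically offer this kind of per-student randomisation as a built-in setting, so the sketch mainly shows why a large enough bank matters: the more questions it holds, the less overlap between any two papers.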
MCQ tests are not a panacea; just like any other type of assessment tool, they have advantages and limitations. This project has confirmed that MCQ tests are adaptable and can be used for different subject areas, as well as to measure a great variety of educational objectives. The evaluation of the assessment design will continue next year, and further feedback will be collected from the cohort and next year’s student representatives.
Dr. Ute Woelfel, Literature and Languages
u.wolfel@reading.ac.uk
Year of activity: 2016/17
Overview
In order to improve students’ engagement, support their abilities as independent learners, and increase their feeling of ownership for their academic work, elements of independent research and research dissemination through the creation of research posters were included in a Part 2 module.
Objectives
Context
In 2016/17 I introduced a new Part 2 module on German National Cinema (GM2CG: 20 credits/30 contact hours). The module is intended to give students a general overview of German cinema from the end of World War I to German unification, and at the same time to allow sustained independent work on themes of interest. To increase engagement with these themes, the independent work is research-oriented, requiring students to reflect on their own expectations, aims and goals for the module (and indeed the course), and to develop their own interests and approach.
Implementation
At the beginning, the students were asked to pick a period or topic from a list and prepare a presentation. The presentation was not part of the summative assessment but served as a foundation for further research. After the presentation, individual discussions with each student were used to decide which aspect of the theme or topic the student would like to pursue further. After each term, essay surgeries were offered, in which students were given the opportunity to discuss the research done so far and decide on a concrete research question for their essay (2,500 words/30%). The students were then asked to turn the findings of their essays into research posters for dissemination to non-specialist audiences (10%). To make sure that students also gained a general understanding of German cinema, a final exam (60%) was scheduled in the summer term.
Impact
The inclusion of independent research elements was very successful in that students engaged more than they normally do when given set topics and essay titles. The majority of students found secondary sources, even additional primary sources, and often identified research topics they would like to pursue in the future. Both the essay and the exam marks were above average. The poster challenged students to re-think their academic findings and present them in a new, visually organised format for interested general audiences. As we used the posters to showcase the students’ work at the University’s Languages Festival, the Visit Days and a Reading Scholars outreach event, a sense of the importance of their work emerged, and pride in what they had achieved grew. The students understood the relevance of the poster for the development of professional skills.
Reflections
The module worked well and highlighted, above all, the potential our students have and can develop in the right learning environment, as well as their willingness to work hard when they are committed. Their engagement with independent research signalled a wish to get active and explore options beyond the set class texts rather than being spoon-fed; there is a clear desire to feel involved, responsible and in charge of one’s own work. I was particularly surprised by how much effort students were prepared to put into the presentations despite the fact that these did not count towards the module mark; as the presentations served as a foundation for assessment, students clearly understood their benefit.
The research elements made the module intensive to teach, as a good number of office hours and slots during the enhancement weeks were used for individual discussions of research and essay topics. As I wanted the students to put their research posters to good use as well, additional feedback slots were offered in which I discussed not just marks but ways of improving the posters; students showed great willingness to keep working on their posters just to see them exhibited, despite the fact that any further input would not change the mark.
Dr Nicola Abram, School of Literature and Languages
Overview
This entry describes the use of online Learning Journals on a Part Three English Literature module. This method of assessment supports students to carry out independent research and to reflect on their personal learning journey, and rewards students’ sustained engagement and progress.
Context
The Part Three optional module Black British Fiction (EN3BBF) is characterised by a large number of set texts that are read at a fast pace. During a single term it covers the period from 1950 to the present day, and asks students to engage with novels, short stories, poetry, a play, and a film, as well as critical theory, history, autobiography, documentary, blogs, political speeches, and press reviews. The module is also characterised by its relevance to historical and contemporary issues of social justice. The quantity and complexity of this material requires students to exercise their independence, taking responsibility for their learning beyond the weekly three hours of tutor-led seminars.
Learning Journals had been in use for this and other modules in the Department of English Literature for several years, in the format of paper workbooks pre-printed with set questions. This effectively served the purpose of structuring students’ weekly studies and directing discussion in seminars. Students worked extremely hard to record their learning in this format, often going beyond the standard material to include additional reading and research of relevance to the module.
However, the paper workbook sometimes resulted in an excess of material that was diluted in focus and difficult to evaluate. Another problem was that the handwritten Journal was retained by the University after submission, meaning students lost this rich record of their learning.
To improve this situation, consultations were held with colleagues in the Department of English Literature and an alternative online Learning Journal was initiated in 2015/16.
Implementation
Experimentation with the Blackboard Journals tool helped to clarify its privacy controls, to ensure that tutors could see the work of all participating students but that students could not see each other’s entries. A discussion with the University of Reading TEL team clarified marking procedures, including making the Journal entries available to view by external examiners.
A discussion was held with colleagues who use paper or online Learning Journals, to establish generic assessment criteria and ensure parity of expectations.
In discussion with another module convenor it was decided that students would be required to submit ten weekly entries, each consisting of 400-500 written words or 4-5 minutes of audio or film recording. The choice of media was a proactive effort to make the Journal more accessible to students with dyslexia and those for whom English is an additional language. The subject of each entry could be determined by the student, prompted by questions on the reading list, discussion in seminars, personal reading, or other activities such as attendance at an exhibition or event.
In the first term of implementation (Autumn 2015) the full ten entries were assessed. In later iterations it was decided that students should instead select five entries to put forward for summative assessment. The selection process facilitates further self-reflection, and the option to discard some entries allows for experimentation without the threat of penalty.
The Learning Journal incorporates a vital formative function: students are invited to a 30-minute feedback tutorial to discuss their first five entries. This conversation refers to the module-specific and task-specific assessment criteria, supporting students to reflect on their work so far and to make plans to fill any gaps. The Learning Journal functions as a mode of assessment for learning, replacing the traditional task of the formative essay.
In terms of summative assessment, the five submitted Learning Journal entries account for 50% of the module mark. An essay constitutes the other 50%. These two forms of assessment are equivalent in scale, with each carrying a guideline of 2,500 words total.
Impact
The fact that students could nominate a selection of entries for summative assessment seemed to encourage risk-taking. Students were more willing to experiment with their critical responses to texts – by testing speculative interpretations, asking questions, or articulating uncertainty – and to express their ideas using creative practices. They became actively engaged in directing both the form and content of their learning.
The move to a restricted length per entry was designed to encourage students to distil their ideas, and to direct attention to the aspects of that week’s learning that mattered most to the student. This was successfully achieved, and feedback showed that students could see their own progress as the weeks passed.
Feedback also showed that students appreciated the opportunity to choose their own topic for each weekly entry, without the constraints of set questions. As a result, entries were remarkably varied. Some students took the opportunity to reflect on their personal circumstances or current political contexts (such as the construction of ‘Britain’ in the discourse around the EU referendum in 2016) using the technical vocabulary learned on the course; others explored creative media such as spoken word poetry. All students gained skills in a genre of writing different from the traditional essay format, which may prove useful for careers in the communication industries.
One unexpected benefit was that the online journal made it possible for the module convenor to track the students’ learning in real-time rather than waiting for summative assessments and end-of-term evaluations. This immediate insight enabled corrective action to be taken during the course of the module where necessary.
Reflections
Students were initially nervous about this unfamiliar method of assessment. Providing detailed module-specific and task-specific marking criteria, as well as example entries, helped to allay these fears. The decision to count only a selection of entries towards summative assessment significantly helped, allowing students to acclimatise to the task with more confidence. As the term progressed, students visibly transitioned towards autonomous learning.
The Learning Journal format proved particularly effective for this module as it created a ‘safe space’ in which students could reflect on the ways in which they had personally experienced, witnessed, or practised racism. Students’ self-reflection extended beyond the subject of skills, strengths and weaknesses to consider their embodied knowledge, ignorance, or privilege. They became more critical in their thinking and more alert and responsible as citizens. Articulating the potency of this real-world engagement, one student commented that “the consistency of the learning journal […] allowed my thinking to naturally mature and changed my outlook on society”.
Marking the Journals became much more efficient using the online format, as entries were typewritten and significantly condensed. Additionally, marking and moderating could be done remotely, without the need to exchange cumbersome documents in person.
It is striking that some students achieving high marks in their Learning Journals did not always achieve equivalent marks in their essays or other modules. I do not consider this to indicate an artificial inflation of grades; rather, I would argue that the Journal recognises and rewards skills that are overlooked in traditional assessment formats and undervalued elsewhere on our programmes. Some students used the Journal to record their personal contribution to seminar discussions and be rewarded for this, while for other students less likely to speak in class (perhaps due to EAL status, gender, disability, or personality) the private entries provided an important opportunity for their insights to be heard.
Follow up
Informal spoken feedback on the general use of Learning Journals was given to the group during seminars, and one-to-one feedback was given halfway through the module. However, several students sought additional reassurance about their entries. In 2017/18 I intend therefore to incorporate a peer-review exercise into the early weeks of the term, to allow students to benchmark their work against others’ and to promote the take-up of alternative media and approaches. This activity will help students to see themselves as a community of learners. Rather than presume that students have access to technology I will supply iPads belonging to the School of Literature and Languages for use in the classroom.
I also intend to circulate example entries in audio and video formats, to show that the Journal validates skills other than traditional essay-writing and to encourage students to experiment with alternative ways of demonstrating their learning.
Dr Nicola Abram, School of Literature and Languages
n.l.abram@reading.ac.uk
Year of activity: 2015/16
Overview
This entry describes the use of screencasts to deliver skills training on a compulsory Part One English Literature module. As a result of the changes outlined here, every student taking English Literature at the University of Reading will have access throughout their degree to a bank of online resources teaching key skills.
Context
Over 200 students enter English Literature programmes at the University of Reading each year, from a range of educational backgrounds. To ensure they all have the key skills and theoretical understanding needed to succeed throughout their degrees, we run a compulsory module in Part One (first year) called ‘Research & Criticism’ (EN1RC).
In the previous incarnation of the module, the Autumn Term had been used for a series of 50-minute lectures on research methods, such as ‘Using online sources’, ‘Using published sources’, ‘Citations and referencing’, and ‘Academic writing’. Students also attended a 50-minute seminar each week, the content of which was determined by the seminar tutor. The Spring Term lectures and seminars then inducted students into foundational critical ideas like ‘narrative’, ‘reader’ and ‘author’, as well as issues such as ‘gender and sexuality’ and ‘race and empire’, via a series of set texts.
I was tasked with convening this module from 2014/15. On my appointment, I sought to engage students as more active participants in the skills training component.
Implementation
The process for developing this module began with an informal conversation with another tutor. We identified a disparity between the module content and the mode of delivery: the traditional lecture format did not seem to be the best vehicle for delivering skills training.
Believing that skills training is most effectively conducted through practical and interactive activities, I set about constructing a series of short formative tasks that would enable students to learn by doing. These were designed to break down the process of research and writing into its component parts, so that students could amass the necessary skills bit by bit. Feedback would be given quickly – usually the following week – by their seminar tutor, meaning changes could be implemented prior to attempting a summative (assessed) essay. The specific formative tasks set were: assembling a bibliography, integrating quotation into a short critical commentary, preparing an essay plan, summarising a fiction text, précising a critical text, and drafting an essay introduction.
Students were supported to undertake each task by a screencast: a short (3-5 minute) animation giving the key information about a particular skill and signposting further resources, which students could watch at their own pace and return to at leisure. Screencasts were released to students on a controlled basis via a dedicated area on the module’s Blackboard pages, accompanying the instructions for each formative task. Upon completion of the module, students had therefore engaged with a bank of ten different screencasts. They retain access to this throughout their degrees, via Blackboard.
Most of the screencasts were prepared using the screen capture programme Camtasia, for which we have multiple departmental licences. Colleagues who had previously delivered the skills lectures were given technical support (where necessary) to repurpose that material into a screencast, and others were invited to volunteer new material. A colleague in Study Advice also contributed a screencast tailored to the needs of English Literature students. This collaborative approach produced a welcome range of different outcomes. Some colleagues used PowerPoint to present written and visual content, while others used Prezi, which better represents the spatial arrangement of the material. Some recorded a voiceover, which provided a welcome sense of connection with an individual tutor, while others chose a musical soundtrack downloaded from a royalty-free website such as www.incompetech.com. A few colleagues used the animation tools PowToon and VideoScribe, rather than simply recording a presentation onscreen.
A meeting with staff teaching on the module was held at the end of its first term and after its first full year. Their reflections on students’ submitted tasks and classroom engagement proved invaluable for the module’s iterative design.
Impact
As a result of this module, students are evidently more alert to the many components of professional writing and are better equipped to perform good academic practice. Selected comments from qualitative module evaluations affirmed the usefulness of this immersive model of skills training: “The first [formative] tasks such as the bibliography were very useful to bridge the gap into HE”, “All the feedback I received was very helpful and helped me improve my work”, and “The screencasts were also a fantastic idea”.
The screencasts have been watched multiple times by students, suggesting that they are a useful resource that can be returned to and referred to repeatedly. The current most-watched is ‘Incorporating quotations’, which has had 969 views since it was uploaded in January 2015.
Using screencasts as a teaching delivery tool has also provided the opportunity to develop the content of the course. Removing the skills content from lectures freed up contact time to be given to important theoretical material and set texts.
Reflections
The model of interactive skills training harnesses the power of constructive alignment, where teaching process and assessment method are calculated to maximise students’ engagement with the subject and/or skills being taught. Even for a discursive discipline like English, the QAA Subject Benchmark Statement encourages assessments “aimed at the development of specific skills (including IT and bibliographical exercises)”.
Although I did not have a particular student demographic in mind when making these changes, the staged development of writing skills seems to offer specific support to international students and English as additional language (EAL) learners, who may be unfamiliar with UK academic conventions and benefit from an atomised approach to writing with regular formative feedback. However, all students benefit from this formal induction to academic literacy. Running a core skills module has an equalising effect on the cohort, compensating for disparities in prior educational contexts and attainment.
Embedding the screencasts for viewing in Blackboard Learn was awkward, since they could not be watched inline by users whose devices did not support a specific plugin. Screencasts were therefore hosted on www.screencast.com, with stable links provided in Blackboard Learn. Both uploading and viewing were easy and effective, but the cap on bandwidth (2GB per month) meant upgrading to a paid-for subscription (currently £8.36 per month) in months where traffic was particularly high. In future I will consider using YouTube, with appropriate privacy settings, to continue the periodic release of screencasts through link-only access.
Follow up
As of 2016-17, the module continues to run using screencasts as a key teaching method. Additional screencasts have been added to the suite as the need arose, for instance to support students’ use of Turnitin as a formative tool, in line with University of Reading strategy. Some screencasts have been replaced as a result of staff turnover. But most remain in use, meaning that the initial work to prepare the content and conduct the screen capture continues to pay off.
Various colleagues in the Department of English Literature have found screencasts to be a useful method for wider skills training. We are now preparing a suite of screencasts to support prospective students and new entrants with the transition to higher education, on topics like ‘What is a lecture?’ and ‘How should I communicate with my tutors?’. We also use screencasts more widely, including as a student assessment method: some of these, along with our public-facing promotional videos, have been given British Sign Language interpretation (contact Dr Cindy Becker for details).
Work is now being undertaken to enhance the training component of the module further through Technology Enhanced Learning, by using quizzes on Blackboard Learn to provide students with immediate feedback on their understanding of skills like proper referencing practice.
Links
Academic Writing: Essay presentation & proof-reading: http://www.screencast.com/t/EXn2au7r8Wj7
Writing a critical précis: http://www.screencast.com/t/83Wz0I4rA
Citations and referencing: http://www.screencast.com/t/aT8PolyDuH
English Literature at the University of Reading YouTube Channel: https://www.youtube.com/user/EnglishAtReading
Dr Emma Mayhew, Politics, Economics and International Relations
e.a.mayhew@reading.ac.uk
Year of activity: 2015/16
Overview
PLanT funding was used to research the impact of a new student feedback platform in Politics. Unitu creates an online student forum from which representatives pull issues onto a separate departmental board. Academics can then add responses and show if an issue has been actioned or closed by dragging between columns.
Context
Increasingly, providers are looking for alternative ways to encourage more continuous student engagement by opening channels of communication between staff and students to target further improvements to the student experience. This is particularly timely given the Teaching Excellence Framework and changes to National Student Survey questions which stress the importance of the student voice.
Implementation
In order to investigate student uptake, usage and the impact of Unitu, the project team adopted a three-stage approach.
Impact
We now have easily accessible sign up and usage figures across the year. We can see how sign up figures respond to our promotional activity. We have a much better understanding of why some students were not aware of the platform, how some students encountered initial technical difficulties with sign up, why some purposefully prefer not to engage with Unitu and, for those that have, which features are of particular use and which are not. This data has led to changes in our approach to student communications and liaison with the software provider to amend some of the features offered.
Reflections
Although some features were problematic, such as the numerous ‘new post’ email notifications, the overall response was positive. 58% of students enrolled on the platform. 55 issues, questions or items of praise were posted, prompting 5,500 student views of follow-on discussion. 52% found that Unitu raised the profile of student representatives. 61% felt it improved the student voice. 75% felt it showed exactly how the Department responded to student feedback. Some changes were made to teaching provision in response to student feedback, including addressing deadline clusters and balancing assessed and non-assessed presentations. Notably, the platform offers academic colleagues the opportunity to explore the pedagogical rationale behind curriculum design and assessment decisions. However, we remain mindful of the ways in which Unitu might lead to difficulties in managing student expectations around the timing and nature of responses, as well as of the impact of adopting a very open discussion forum, which requires clear rules of engagement.
Follow up
We have started work on wider dissemination of our experiences. In September 2016 a Part Two student, Jack Gillum, presented as part of a broader University of Reading symposium at the Researching, Advancing & Inspiring Student Engagement (RAISE) conference in Loughborough. Unitu is now being considered by Computer Science and the School of Construction Management and Engineering. We would like to continue to share our experiences with new adopters.