Taking Academic Language and Literacy Courses Online

Dr Karin Whiteside, ISLI

Overview

Alongside its embedded discipline-specific provision, the Academic English Programme (AEP) offers a range of open sign-up academic language and literacy courses each term. This case study outlines the process of rapidly moving the summer term provision online, and reports student feedback and reflections on the experience, which will help inform continued online delivery this autumn term.

Objectives

Our aim was to provide academic language and literacy support which, as far as practicably possible, was equivalent in scope and quality to our normal face-to-face offering for the same time of year. In the summer term, our provision is particularly important for master’s students working on their dissertations, with high numbers applying for Dissertation & Thesis Writing, while courses such as Core Writing Skills and Academic Grammar also provide important ‘building block’ input needed for competent research writing.

Context

Prior to the COVID crisis, our face-to-face courses on different aspects of written and spoken Academic English had been offered for open application on a first-come, first-served basis, with a rolling weekly waiting list. With a maximum of 20 students per class, we were able to offer interactive, task-based learning involving analysis of target language and communicative situations in context, practice exercises and opportunities for discussion and feedback within a friendly small-group environment.

Implementation

Within an extremely tight turnaround time of four weeks, we settled on a slightly slimmed-down programme of five ‘open-to-all’ online courses – Academic Grammar, Core Academic Writing Skills, Dissertation & Thesis Writing, Essays: Criticality, Argument, Structure and Listening & Note-taking – and replaced our normal application process with self-enrolment via Blackboard, meaning uncapped numbers could sign up and have access to lessons.

Time constraints meant we had to be pragmatic about where to focus our energies. Course content needed to be converted for online delivery in a way that was both effective and sustainable, given the potential continued need for online AEP provision going into 2020/21. We predicted (rightly!) that the process of initially converting small-group interactive learning materials to an online format that retained their inductive, task-based qualities would be labour-intensive and time-consuming. Therefore, for the short term (summer 2020) we adopted a primarily asynchronous approach, with a view to increasing the proportion of synchronous interactivity in future iterations once content was in place.

In converting face-to-face lessons to online delivery, we found that what often worked most effectively was to break down the contents of a two-hour face-to-face lesson into two to three task-focused online parts, each introduced and concluded with short, narrated PowerPoints/MP4 videos. We set a weekly release date for lesson materials on each course, often accompanied by a ‘flipped’ element, labelled ‘Pre-lesson Task’, released a few days before the main lesson materials. We set up accompanying weekly Discussion Forums where students could ask questions or make comments, for which there was one ‘live’ hour per week. Apart from Pre-lesson Tasks, task answers were always made available at the same time as lessons to allow students complete autonomy.

Moving rapidly to online delivery meant not necessarily having the highest-specification e-learning tools immediately to hand, but instead working creatively to get the best out of existing technologies, including the Blackboard platform, which prior to this term had had a mainly ‘depository’ function in AEP. To ensure ease of navigation, the various attachments involved in creating such lessons needed to be carefully curated by Folder and Item within Blackboard Learning Materials. Key to this was clear naming and sequencing, with accompanying instructions at Folder and Item level.

Impact, Reflections and Follow-up

Positive outcomes of taking the summer AEP provision online have included noticeably higher uptake (e.g. in Academic Grammar, 92 self-enrolments compared to 30 applications in summer term 2018/19) and noticeably higher real engagement (e.g. an average of 11 students attended the 2018/19 summer face-to-face Academic Grammar class, compared to a high of 57 and an average of 38 students accessing each online lesson). Running the courses asynchronously online has meant no waiting lists, allowing access to course content for all students who register interest. It also means that students can continue to join courses and work through materials over the summer vacation period, which is particularly useful for international master’s students working on dissertations for September submission, and for cohorts overseas such as the IoE master’s students in Guangdong.

In survey responses gathered thus far, reaction to course content has been largely positive: “It provided me an insight into what is expected structure and criticality. Now that I am writing my essay, I could see the difference”. Students appreciated teacher narration, noticing if it was absent: “I would prefer our teacher to talk and explain the subject in every slide.” The clarity of lesson presentation within Blackboard was also noted: “I think the most impressive part in this course is the way these lessons were arranged in BB as every lessons were explicitly highlighted, divided into parts with relevant tasks and their answers. Thus, I could effectively learn the content consciously and unconsciously.”

There was a range of reactions to our approach to online delivery and to online learning more generally. 52% of students were happy with entirely asynchronous learning, while 48% would have preferred a larger element of real-time interactivity: “Although this lessons ensured the freedom in dealing with the material whenever it was possible, the lack of a live-scheduled contact with the teacher and other students was somewhat dispersive.”; “I prefer face to face in the classroom because it encourages me more to contribute”. In normal circumstances, 34% of students said they would want entirely face-to-face AEP classes, whilst 21% would want a blended provision and 45% would prefer learning to remain entirely online, with positive feedback regarding the flexibility of the online provision: “it’s flexible for students to do it depending on their own time.”; “Don’t change the possibility to work asynchronously. It makes it possible to follow despite being a part time student.”

Going forward, we plan to design in regular synchronous elements in the form of webinars which link to the asynchronous spine of each course, to respond to students’ requests for more live interactivity. We also plan to revisit and refine our use of Discussion Forums in Blackboard. Whilst engagement with lesson content was high, students made limited use of Q&A Forums. It is hoped that more targeted forums directly linked to flipped tasks will encourage greater engagement with this strand of the online delivery in the future.

Links

The AEP website ‘Courses, Workshops and Webinars’ page, which gives details of this summer term’s courses and what will be on offer in autumn: http://www.reading.ac.uk/ISLI/enhancing-studies/academic-english-programme/isli-aep-courses.aspx

Developing Diversity and Inclusion teaching: The importance of D&I and Ethical Practice

Dr Allán Laville, Psychology and Clinical Language Sciences, a.laville@reading.ac.uk

Overview

In the training of Psychological Wellbeing Practitioners (PWPs), teaching must include a focus on Diversity and Inclusion (D&I) as well as how this relates to ethical practice. Therefore, I created a 15-minute screencast that tied key D&I principles to clinical practice, with a particular focus on ethical practice within this area.

Objectives

  1. To support students in being aware of key D&I and ethical principles and how these principles relate to their clinical practice.
  2. To support students in writing a 500-word reflective piece on the importance of considering D&I in their ethically sound clinical practice.

Context

PWP programmes include D&I training within the final module of the clinical programme, but to meet the British Psychological Society (BPS) programme standards, D&I training needs to be incorporated throughout. Furthermore, this training should be tied to the BPS programme standard on Ethical Practice teaching (Module PY3EAA1/PYMEAA).

Implementation

The first step was to identify the key sources to include within the screencast. These were wide-ranging, from legislation (Equality Act 2010) and positive practice guides (Improving Access to Psychological Therapies) to ethical practice guidelines (British Psychological Society), alongside reference to the University’s Fitness to Practise policy.

The second step was to think about how students could engage with the screencast in a meaningful way. Based on an earlier T&L Exchange project report of mine (https://sites.reading.ac.uk/t-and-l-exchange/2019/07/23/developing-innovative-teaching-the-importance-of-reflective-practice/), I wanted to include an element of reflective practice. Students were asked to write a 500-word reflective piece on their own take-home points from the screencast, preferably following the Rolfe, Freshwater and Jasper (2001) reflective model: a) ‘what’ (what is being considered); b) ‘so what’ (which I tell my students is the ‘why care?’ part!); and c) ‘now what’ (reviewing the ‘what’ and ‘so what’ to detail a SMART action plan for future clinical practice).

Example by Will Warley, Part 3 MSci Applied Psychology (Clinical) student.

Impact

The student feedback about the screencast and completing the reflective piece has been very positive, across both the MSci in Applied Psychology (Clinical) and the Charlie Waller Institute (CWI) PG Cert in Evidence-Based Psychological Treatments (IAPT Pathway). The training materials have also been shared with members of the SPCLS Board of Studies for CWI training programmes.

In regard to national-level impact, I have presented this innovative approach to D&I teaching at the BPS Programme Liaison Day, which included the BPS PWP Training Committee and Programme Directors from across the UK. The presentation was very well received, including requests to disseminate the materials that we use in teaching at UoR. These materials have therefore now been circulated to all PWP training providers in the UK to inform their D&I provision.

Reflections

One core reason for the success of this activity was the commitment and creativity of our students! Some students used software to create excellent mind maps, interactive presentations or a YouTube video! There was even an Instagram account used to illustrate the main take-home points from the screencast, which I thought was particularly innovative. Overall, I was absolutely delighted to see such high levels of student engagement with topics that are so important – both personally and professionally.

In regard to better implementation, it is possible that slightly more guidance could have been provided regarding how to approach the reflective task, but the brief of ‘be as creative as possible!’ worked very well indeed!

Follow up

I will be following up with the BPS PWP Training Committee in 2020 to see how this activity has developed within other PWP training providers! We will then create a summary of all innovative approaches to including D&I in PWP programmes and how these meet the programme standards.

Links

https://my.cumbria.ac.uk/media/MyCumbria/Documents/ReflectiveModelRolfe.pdf

Student YouTube video as submission on reflective task: https://youtu.be/hMU6F_dknP4

Using Flipped Learning to Meet the Challenges of Large Group Lectures

Amanda Millmore / School of Law / a.millmore@reading.ac.uk

Overview

Faced with double-teaching a cohort of 480 students (plus an additional 30 at the University of Reading Malaysia), I was concerned to ensure that students in each lecture group had a similar teaching experience. My solution was to “flip” some of the learning, by recording short video lectures covering content that I would otherwise have lectured live, and to use the time freed up to slow the pace and instigate active learning within the lectures. Students provided overwhelmingly positive feedback in formal and informal module evaluations. The introduction of flipped learning has also aided student welfare, allowing those who are absent, have disabilities or face language barriers to revisit material as and when needed. For staff, it has reduced my workload and has the ongoing benefit of reducing the workload of colleagues who have taken over teaching the module.

Objectives

  • Record short video lectures to supplement live lectures.
  • Use the time freed up by the removal of content no longer delivered live to introduce active learning techniques within the lectures.
  • Support the students in their problem-solving skills (tested in the end of year examination).

Context

The module “General Introduction to Law” is a “lecture only” first year undergraduate module, which is mandatory for many non-law students, covering unfamiliar legal concepts. Whilst I have previously tried to introduce some active learning into these lectures, I have struggled with time constraints due to the sheer volume of compulsory material to be covered.

Student feedback requested more support in tackling legal problem questions. I wanted to assist students, but needed to free up some space within the lectures to do this; “flipping” some of the content by creating videos seemed to offer a solution.

As many academics (Berrett, 2012; Schaffzin, 2016) have noted, there is more to flipping than merely moving lectures online: it is about a change of pedagogical approach.

Implementation

I sought initial support from the TEL (Technology Enhanced Learning) team, who were very happy to give advice about technology options. I selected the free Screencast-O-Matic software, which was simple to use with minimal equipment (a headset with microphone plugged into my computer).

I recorded 8 short videos, which were screencasts of some of my lecture slides with my narration; 6 covered traditional lecture content and 2 offered problem-solving advice and modelled an exemplar problem question and answer (which I had previously offered as straightforward read-only documents on Blackboard).

The software that I used restricted me to 15-minute videos, which worked well for maintaining student attention. My screencast videos were embedded within the Blackboard module and could also be viewed directly on the internet: https://screencast-o-matic.com/u/iIMC/AmandaMillmoreGeneralIntroductiontoLaw.

I reminded students to watch the videos via email and during the lectures, and I was able to track the number of views of each video, which enabled me to prompt students if levels of viewing were lower than I expected.

By moving some of the content delivery online I was also able to incorporate more problem-solving tasks into the live lectures. I was able to slow the pace and to invite dialogue, often by using technology enhanced learning. For example, I devoted an hour to tackling an exam-style problem, with students actively working to solve the problem using the knowledge gained via the flipped learning videos and previous live lectures. I used the applications Mentimeter, Socrative and Kahoot to interact with the students, asking them multiple-choice questions, encouraging them to vote on questions and to create word clouds of their initial thoughts on tackling problem questions as we progressed.

Evaluation

I evaluated reaction to the module using the usual formal and informal module evaluations. I also tracked engagement with the videos and actively used these figures to prompt students if views were lower than expected. I monitored lecture attendance and didn’t notice any drop-off. Finally, I reviewed end-of-year results to assess the impact on students’ results.

Impact

Student feedback, about the videos and problem solving, was overwhelmingly positive in both formal and informal module evaluations.

Videos can be of assistance if a student is absent, has a disability or wishes to revisit the material. Sankoff (2014) and Billings-Gagliardi and Mazor (2007) dismiss concerns about reduced student attendance due to online material, and this was borne out by my experience, with no noticeable drop-off in numbers attending lectures; I interpret this as a positive sign of student satisfaction. The videos worked to supplement the live lectures rather than replace them.

There is a clear, positive impact on my own workload and that of my colleagues. Whilst I am no longer teaching on this module, my successor has been able to use my videos again in her teaching, thereby reducing her own workload. I have also been able to re-use some of the videos in other modules.

Reflections

Whilst flipped learning is intensive to plan, create and execute, the ability to re-use the videos in multiple modules is a huge advantage; short videos are simple to re-record if, and when, updating is required.

My initial concern that students would not watch the videos was utterly misplaced. Each video has had in excess of 1200 views (and one video has exceeded 2500). Some of the material was covered only by the flipped learning videos, yet still appeared within the examination; students who tackled those questions did as well as those answering questions on content delivered via live lecture, although those questions were less popular (2017/18 examination).

I was conscious that there might be some students who would simply ignore the videos, thereby missing out on chunks of the syllabus. I tried to mitigate this by running quizzes during lectures on the recorded material, and by offering banks of multiple-choice questions (MCQs) on Blackboard for students to test their knowledge (aligned to the summative examination, which included a multiple-choice section). In addition, I clearly signposted the importance of the video-recorded material by email, on the Blackboard page and orally, and emphasised that it would form part of the final examination and could not be ignored.

My experience echoes Schaffzin’s (2016) study monitoring impact, which showed no statistically significant difference in law results after instituting flipped learning, although she felt that it was a more positive teaching method. Examination results for the module in the end-of-year summative assessment (100% examination) were broadly consistent with the results in previous academic years, but student satisfaction was higher, with positive feedback about the use of videos and active learning activities.

Follow Up

Since creating the flipped learning videos, another colleague has taken over as convenor and continued to use the videos I created. Some of the videos have also been used in other modules. I have used screencast videos in another non-law module, and also used them as introductory material for a large core Part 1 Law module. Student feedback in module evaluations praised the additional material. One evolution in another module was that, when I ran out of time to cover working through a past exam question within a lecture, I created a quick screencast which finished off the topic for students; I felt that it was better to go at a more sensible pace in the lecture and use the screencast rather than rush through the material.

Michelle Johnson, Module Convenor 2018-2019 commented that:

“I have continued to use and expand the flipped learning initiative as part of the module and have incorporated further screencasts into the module in relation to the contract law content delivered. This allowed for additional time on the module to conduct a peer-assessment exercise focussed on increasing the students’ direct familiarity with exam questions and also crucially the marking criteria that would be used to score their Summer exams. Students continue to be very positive about the incorporation of flipped learning material on the module and I feel strongly that it allowed the students to review the more basic introductory content prior to lectures, this allowing time for a deeper engagement with the more challenging aspects of the lectures during lecture time. This seemed to improve students understanding of the topics more broadly, allowing them to revisit material whenever they needed and in a more targeted way than a simple lecture recording.”

TEF

TQ1, LE1, SO3

Links

University of Reading TEL advice about personal capture – https://sites.reading.ac.uk/tel-support/category/learning-capture/personal-capture

Berrett, D. (2012) How “Flipping” the Classroom Can Improve the Traditional Lecture. Chronicle of Higher Education. https://www.chronicle.com/article/how-flipping-the-classroom/130857

Billings-Gagliardi, S. and Mazor, K. (2007) Student decisions about lecture attendance: do electronic course materials matter? Academic Medicine: Journal of the Association of American Medical Colleges, 82(10), S73-S76.

Sankoff, P. (2014) Taking the Instruction of Law outside the Lecture Hall: How the Flipped Classroom Can Make Learning More Productive and Enjoyable (for Professors and Students). Alberta Law Review, 51, pp. 891-906.

Schaffzin, K. (2016) Learning Outcomes in a Flipped Classroom: A Comparison of Civil Procedure II Test Scores between Students in a Traditional Class and a Flipped Class. University of Memphis Law Review, 46, p. 661.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 2)

Dr Madeleine Davies and Michael Lyons, School of Literature and Languages

Overview

The Department of English Literature (DEL) has run two student focus groups and two whole-cohort surveys as part of our Teaching and Learning Development Fund (TLDF) ‘Diversifying Assessments’ project. This is the second of two T&L Exchange entries on this topic; the first entry (Part 1) outlines how the feedback received from students indicates that their module selection is informed by the assessment models used by individual modules. Underpinning these decisions is an attempt to avoid the ‘stress and anxiety’ that students connect with exams. The surprise of this second round of focus groups and surveys is the extent to which this appears to dominate students’ teaching and learning choices.

Objectives

  • The focus groups and surveys are used to gain feedback from DEL students about possible alternative forms of summative assessment to our standard assessed essay + exam model. This connects with the Curriculum Framework in its emphasis on Programme Review and also with the aims of the Assessment Project.
  • These conversations are designed to discover student views on the problems with existing assessment patterns and methods, as well as their reasons for preferring alternatives to them.
  • The conversations are also being used to explore the extent to which electronic methods of assessment can address identified assessment problems.

Context

Having used focus groups and surveys to provide initial qualitative data on our assessment practices, we noticed a widespread preference for alternatives to traditional exams (particularly the Learning Journal), and decided to investigate the reasons for this further. The second focus group and subsequent survey sought to identify why the Learning Journal in particular is so favoured by students, and we were keen to explore whether students perceived teaching and learning aims to be better achieved via this method than by the traditional exam. We also took the opportunity to ask students what they value most in feedback: the first focus group and survey had touched on this, but we decided this time to give students the opportunity to select four elements of feedback which they could rank in order of priority. This produced more nuanced data.

Implementation

  • A second focus group was convened to gather more detailed views on the negative attitudes towards exams, and to debate alternatives to this traditional assessment method.
  • A series of questions was asked to generate data and dialogue.
  • A Survey Monkey survey was circulated to all DEL students with the same series of questions as those used for the focus group, in order to determine whether the focus group’s responses were representative of the wider cohort.
  • The Survey Monkey results are presented below. The numbers refer to student responses to a category (e.g. in graphic 1, 50 students selected option (b)). Graphics 2 and 5 allowed students to rank their responses in order of priority.

Results

  • Whilst only 17% in the focus group preferred to keep to the traditional exam + assessed essay method, the survey found the aversion to exams to be more prominent. 88% of students preferred the Learning Journal over the exam, and 88% cited the likelihood of reducing stress and anxiety as a reason for this preference.
  • Furthermore, none of the survey respondents wanted to retain the traditional exam + assessed essay method, and 52% were in favour of a three-way split between types of assessment; this reflects a desire for significant diversity in assessment methods.
  • We found it helpful to know precisely what students want in terms of feedback: ‘a clear indication of errors and potential solutions’ was the overwhelming response. ‘Feedback that intersects with the Module Rubric’ was the second-highest scorer (presumably a connection between the two was identified by students).
  • The students in the focus group mentioned a desire to choose assessment methods within modules on an individual basis. This may be one issue in which student choice and pedagogy may not be entirely compatible (see below).
  • Assessed Essay method: the results seem to indicate that replacing an exam with a second assessed essay is favoured across the Programme rather than being pinned to one Part.

Reflections

The results in the ‘Feedback’ sections are valuable for DEL: they indicate that clarity, diagnosis, and solutions-focused comments are key. In addressing our feedback conventions and practices, this input will help us to reflect on what we are doing when we give students feedback on their work.

The results of the focus group and of the subsequent survey do, however, raise some concerns about the potential conflict between ‘student choice’ and pedagogical practice. Students indicate that they not only want to avoid exams because of ‘stress’, but that they would also like to be able to select assessment methods within modules. This poses problems because marks are in part produced ‘against’ the rest of the batch: if the ‘base-line’ is removed by allowing students to choose assessment models, we would lack one of the main indicators of level.

In addition, the aims of some modules are best measured using exams. Convenors need to consider whether a student’s work can be assessed in non-exam formats but, if an exam is the best test of teaching and learning, it should be retained, regardless of student choice.

If, however, students overwhelmingly choose non-exam-based modules, this would leave modules retaining an exam in a vulnerable position: the aim of this project is to find ways to diversify our assessments, but diversification could leave modules that retain traditional assessment patterns open to being deselected by students. This may have implications for benchmarking.

It may also be the case that the attempt to avoid ‘stress’ is not necessarily in students’ best interests. The workplace is not a stress-free zone and it is part of the university’s mission to produce resilient, employable graduates. Removing all ‘stress’ triggers may not be the best way to achieve this.

Follow up

  • DEL will convene a third focus group meeting in the Spring Term.
  • The co-leaders of the ‘Diversifying Assessments’ project will present the findings of the focus groups and surveys to DEL. We will outline the results of our work and call on colleagues to reflect on the assessment models used on their modules, with a view to volunteering to adopt different models if they think this appropriate to the teaching and learning aims of their modules.
  • This should produce an overall assessment landscape that corresponds to students’ request for ‘three-way’ (at least) diversification of assessment.
  • The new landscape will be presented to the third focus group for final feedback.

Links

With thanks to Lauren McCann of TEL for sending me the first link which includes a summary of students’ responses to various types of ‘new’ assessment formats.

https://www.facultyfocus.com/articles/online-education/assessment-strategies-students-prefer/

Conclusions (May 2018)

The ‘Diversifying Assessment in DEL’ TLDF Mini-Project revealed several compelling reasons for reflecting upon assessment practice within a traditional Humanities discipline (English Literature):

  1. Diversified cohort: HEIs are recruiting students from a wide variety of socio-cultural, economic and educational backgrounds and assessment practice needs to accommodate this newly diversified cohort.
  2. Employability: DEL students have always acquired advanced skills in formal essay-writing but graduates need to be flexible in terms of their writing competencies. Diversifying assessment to include formats involving blog-writing, report-writing, presentation preparation, persuasive writing, and creative writing produces agile students who are comfortable working within a variety of communication formats.
  3. Module-specific attainment: the assessment conventions in DEL, particularly at Part 2, follow a standardised format (33% assessed essay and 67% exam). The ‘Diversifying Assessment’ project revealed the extent to which module leaders need to reflect on the intended learning outcomes of their modules and to design assessments that are best suited to the attainment of them.
  4. Feedback: the student focus groups convened for the ‘Diversifying Assessment’ project returned repeatedly to the issue of feedback. Conversations about feedback will continue in DEL, particularly in relation to discussions around the Curriculum Framework.
  5. Digitalisation: eSFG (via EMA) has increased the visibility of a variety of potential digital assessment formats (for example, Blackboard Learning Journals, Wikis and Blogs). This supports diversification of assessment and it also supports our students’ digital skills (essential for employability).
  6. Student satisfaction: while colleagues should not feel pressured by student choice (which is not always modelled on academic considerations), there is clearly a desire among our students for more varied methods of assessment. One Focus Group student argued that fees had changed the way students view exams: students’ significant financial investment in their degrees has caused exams to be considered unacceptably ‘high risk’. The project revealed the extent to which Schools need to reflect on the many differences made by the new fees landscape, most of which are invisible to us.
  7. Focus Groups: the Project demonstrated the value of convening student focus groups and of listening to students’ attitudes and responses.
  8. Impact: one Part 2 module has moved away from an exam and towards a Learning Journal as a result of the project and it is hoped that more Part 2 module convenors will similarly decide to reflect on their assessment formats. The DEL project will be rolled out School-wide in the next session to encourage further conversations about assessment, feedback and diversification. It is hoped that these actions will contribute to Curriculum Framework activity in DEL and that they will generate a more diversified assessment landscape in the School.

Rethinking assessment design, to improve the student/staff experience when dealing with video submissions

Rachel Warner, School of Arts and Communication Design

Rachel.Warner@pgr.reading.ac.uk

Jacqueline Fairbairn, Centre for Quality Support and Development

j.fairbairn@reading.ac.uk

Overview

Rachel in Typography and Graphic Communication (T&GC) worked with the Technology Enhanced Learning (TEL) team to rethink an assignment workflow, to improve the student/staff experience when dealing with video submissions. Changes were made to address student assessment literacies, develop articulation skills, support integration between practice and reflection, and make use of OneDrive to streamline the archiving and sharing of video submissions via Blackboard.

This work resulted in students developing professional ‘work skills’ through the assessment process and the production of a toolkit to support future video assessments.

Objectives

  • Improve staff and student experiences when dealing with video assignment submissions. Specifically, streamlining workflows by improving student assessment literacy and making use of university OneDrive accounts.
  • Support students to develop professional skills for the future, through assessment design (developing digital literacies and communication skills).
  • Provide an authentic assessment experience, in which students self-select technologies (choosing software and a task to demonstrate) to answer a brief.

Context

The activity was undertaken for Part 1 students learning skills in design software (e.g. Adobe Creative apps). The assignment required students to submit a ‘screencast’ video recording that demonstrated a small task using design software.

Rachel wanted to review the process for submitting video work for e-assessment, and find ways to streamline the time-intensive marking process, particularly in accessing and reviewing video files, without compromising good assessment practice. This is also acknowledged by Jeanne-Louise Moys, T&GC’s assessment and feedback champion: “Video submissions help our students directly demonstrate the application of knowledge and creative thinking to their design and technical decisions. They can be time-consuming to mark so finding ways to streamline this process is a priority given our need to maintain quality practices while adapting to larger cohorts.”

The TEL team was initially consulted to explore processes for handling video submissions in Blackboard, and to discuss the implications for staff time (in terms of supporting students, archiving material and accessing videos for marking). Designing formative support and improving students’ assessment literacy was also a key driver, to reduce the number of queries and technical issues when working with video technologies.

Implementation

Rachel consulted TEL, to discuss:

  • balancing the pedagogic implications of altering the assignment
  • technical implications, such as submission to Blackboard and storage of video

To address the issue of storing video work, students were asked to make use of OneDrive areas to store and submit work (via ‘share’ links). Use of OneDrive encouraged professional behaviours, such as adopting a systematic approach to file naming, and it meant the videos were securely stored on university systems using a well-recognised, industry-standard platform.

To further encourage professional working, students were required to create a social media account to share their video. YouTube was recommended; it is used prolifically by designers to showcase work and portfolios, and across wider professional settings.

Students were provided with a digital coversheet to submit URLs for both the OneDrive and YouTube videos.

The most effective intervention was the introduction of a formative support session (1.5 hours). Students practised using their OneDrive areas, set up YouTube accounts and reviewed examples of screencasts. This workshop supported students to understand the professional skills that could be developed through this medium. The session introduced the assessment requirements, toolkit and digital coversheet, and allowed students to explore the technologies in a supported manner (improving students’ assessment literacy!).

The assignment instructions were strategically revised, to include information (‘hints and tips’) to support the students’ development of higher production values and other associated digital literacies for the workplace (such as file naming conventions, digital workflows, and sourcing online services).

Students were provided with the option to self-select recording/editing software to undertake the screencast video. Free-to-use tools that students could explore were recommended; ‘Screencast-o-matic’ and ‘WeVideo’ provide basic to intermediate options.

Impact

Marking the submissions was made easier by the ability to access videos in a consistent format, using a clearly structured submission process (the digital coversheet). The ability to play URL links directly through OneDrive meant Rachel was able to store copies of the videos in a central area for future reference. Students also provided a written summary of their video, highlighting key timings that demonstrate the marking criteria (so the marker does not have to watch the whole video).

Rachel rationalised her approach to marking by developing a spreadsheet, which allowed her to cross-reference feedback against the assessment criteria (in the form of a rubric) and between assignments. This greatly sped up the marking workflow and allowed Rachel to identify patterns in students’ work where common feedback statements could be applied, as appropriate.
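
The same cross-referencing idea can also be automated. The snippet below is a minimal, purely illustrative Python sketch rather than a description of the actual spreadsheet used here: the file name marks.csv, the criteria ‘clarity’ and ‘technique’, the comment codes and the feedback statements are all hypothetical, chosen only to show how per-criterion codes might be mapped onto a bank of reusable feedback statements.

# Illustrative sketch only: cross-referencing per-criterion codes against a
# bank of reusable feedback statements. All names and statements are hypothetical.
import csv

FEEDBACK_BANK = {
    ("clarity", "A"): "Clear, well-paced narration throughout.",
    ("clarity", "B"): "Narration is rushed in places; plan timings before recording.",
    ("technique", "A"): "Software technique demonstrated accurately and in full.",
    ("technique", "B"): "Some steps were skipped; show every step on screen.",
}

def compile_feedback(marks_csv):
    """Return {student: [feedback statements]} from a CSV of per-criterion codes."""
    feedback = {}
    with open(marks_csv, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: student, clarity, technique
            comments = [
                FEEDBACK_BANK[(criterion, code)]
                for criterion, code in row.items()
                if criterion != "student" and (criterion, code) in FEEDBACK_BANK
            ]
            feedback[row["student"]] = comments
    return feedback

if __name__ == "__main__":
    for student, comments in compile_feedback("marks.csv").items():
        print(student, "->", "; ".join(comments))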

The assessment highlighted gaps in students’ existing digital literacies. The majority of students had not made a video recording before, and many were apprehensive about speaking into a microphone. After completing the screencasts, previously unconfident students noted in their module reflections that the screencast task had developed their confidence to communicate and to explore a new technology.

Reflections

The modifications to the assessment:

  • Reflected professional digital competencies required of the discipline;
  • Allowed students to explore a new technology and way of working in a supported context; and,
  • Built confidence, facilitated assessment literacy, and encouraged reflection.

Future modifications to the screencast submission:

  • Peer review could be implemented, asking students to upload videos to a shared space for formative feedback (such as Facebook or a Blackboard discussion board).
  • The digital coversheet had to be downloaded to access URL links. In future, students could paste into the submission comment field, for easier access when marking.
  • Rachel is developing a self-assessment checklist to help students reflect on the production values of their work. The summative assessment rubric is focused on video content, not production values; however, it would be useful for students to get feedback on professional work skills, for example communication skills and the use of narrative devices, which translate across other graphic mediums.

Toolkit basics:

(Thumbnail of the toolkit document; the full toolkit is available via links on the webpage.)

  • Outline task expectations and software options, and give recommendations.
  • Source examples of screencasts from your industry, discuss with students.
  • Provide hints and tips for creating effective screencasts.
  • Provide clear submission instructions. Consider asking students to use the ‘submission comment’ field to paste links to their work, for quick marker access to URLs.
  • Plan a formative workshop session, to practise using the software and go through the submission process (time invested here is key!).
  • Create a self-assessment checklist to enhance the production quality of videos and to highlight the transferable skills that can be developed by focusing on production quality.
  • Consider creating a shared online space for formative peer-feedback (e.g. Blackboard discussion forum).
  • Consider using a marking spreadsheet to cross-reference feedback and highlight good examples of screencasts that can be utilised in other teaching.

Links

Screencast example: (YouTube link) This screencast was altered and improved after submission and marking, taking on board feedback from the assessment and module. The student noted: ‘After submission, I reflected on my screencast, and I changed the original image because it was too complex to fit into the short time that I had available in the screencast. I wanted to use the screencast to show a skill that I had learned and the flower was simple enough to showcase this’. Part of the module was to be reflective and learn from ‘doing’; this screencast is an example of a student reflecting on their work and improving their skills after the module had finished.

Screencast example: (YouTube link) This screencast was a clear and comprehensive demonstration of a technique in Photoshop that requires multiple elements to achieve results. It has a conclusion that demonstrates the student’s awareness that the technique is useful in scenarios other than the one demonstrated, giving the listener encouragement to continue learning. The student used an intro slide and background music, demonstrating exploration of the screencast software alongside compiling their demonstration.

Screencast example: (YouTube link) This demonstrates a student who is competent in a tool, able to use their own work (work from another module on the course) to demonstrate a task, and additionally includes their research into how the tool can be used for other tasks.

Other screencast activity from the GRASS project in the Typography & Graphic Communication department: (Blog post) A previous project for Part 1 students that included the use of screencasts to demonstrate students’ achievement of learning outcomes.

Engaging students in assessment design

Dr Maria Kambouri-Danos, Institute of Education

m.kambouridanos@reading.ac.uk

Year of activity 2016/17

Overview

This entry aims to share the experience of re-designing and evaluating assessment in collaboration with students. It explains the need for developing the new assessment design, and then discusses the process of implementing and evaluating its appropriateness. Finally, it reflects on the impact of MCQ tests when assessing students in higher education (HE), and the importance of engaging students as partners in the development of new assessment tools.

Objectives

  • To re-design assessment and remove a high-stakes assessment element.
  • To proactively engage ‘students as partners’ in the development and evaluation of the new assessment tool.
  • To identify the appropriateness of the new design and its impact on both students and staff.

Context

Child Development (ED3FCD) is the core module for the BA in Children’s Development and Learning (BACDL), meaning that a pass grade must be achieved on the first submission to gain a BA Honours degree classification (failing leads to an ordinary degree). The assessment needed to be redesigned as it placed the total weight of students’ marks on one essay. As the programme director, I wanted to engage the students in the re-design process and evaluate the impact of the new design on both students and staff.

Implementation

After attending a session on ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, I decided to explore further the idea of using multiple choice question (MCQ) tests. To do so, I attended a session on ‘Team Based Learning (TBL)’ and another on ‘MCQ: More than just a Test of Information Recall’, to gather targeted knowledge about designing effective MCQ questions.

I realised that MCQ tests can help assess students’ understanding and knowledge, and can also stimulate students’ active and self-managed learning. Guided by the idea of ‘assessment for learning’, I proposed the use of an MCQ test during a steering group meeting (employees and alumni) and a Board of Studies (BoS) meeting, which second-year Foundation Degree and BACDL student representatives attended. The idea was resisted initially, as MCQ tests are not traditionally used in HE education departments. However, after exploring different options and highlighting the advantages of MCQ tests, agreement was unanimous. At the last BoS meeting (2016), students and staff finalised the proposal for the new design, proposing to use the MCQ test for 20% of the overall mark and keeping the essay for the remaining 80%.

At the beginning of 2017, I invited all BACDL students to post their thoughts and concerns about the new design (and the MCQ test) anonymously on Padlet. Based on these comments, I then worked closely with the programme’s student representatives, holding regular meetings to discuss, plan and finalise the assessment design. We decided how to calculate the final mark (as the test was completed individually and then in a group), as well as the total number of questions, the duration of the test, and so on. A pilot study was then conducted, during which a sample MCQ test was shared with all the students, asking them to practise and then provide feedback. This helped to decide the style of the questions used for the final test, an example of which is given below:

There are now more than one million learners in UK schools who speak English as an additional language (EAL). This represents a considerable proportion of the school population, well above 15 per cent. To help EAL children develop their English, teachers should do all the following, except…

a. use more pictures and photographs to help children make sense of new information.

b. use drama and role play to make learning memorable and encourage empathy.

c. maintain and develop the child’s first language alongside improving their English.

d. get children to work individually because getting them into groups will confuse them and make them feel bad for not understanding.

e. provide opportunities to talk before writing and use drills to help children memorise new language.

Impact

Students were highly engaged in the process of developing the new design, and the staff-student collaboration encouraged the development of bonds within the group. The students were excited by the opportunity to actively develop their own course, and the experience empowered them to take ownership of their own learning. All of them agreed that they felt important and, as a student representative said, that “their voices were heard”.

The new design encouraged students to take the time to gauge what they already know and identify their strengths and weaknesses. Students themselves noted that the MCQ test helped them to develop their learning as it was an additional study opportunity. One of them commented that “…writing notes was a good preparation for the exam. The examination was a good learning experience.” Staff also agreed that the test enabled students to (re)evaluate their own performance and enhance their learning. One of the team members noted that the “…test was highly appropriate for the module as it offered an opportunity for students to demonstrate their proficiency against all of the learning outcomes”.

Reflections

The new assessment design was implemented successfully because listening to the students’ voice and responding to their feedback was an essential part of the design process. Providing opportunities for both students and staff to offer their views and opinions, and clearly recognising and responding to their needs, were essential, as these measures empowered them and helped them to take ownership of their learning.

The BACDL experience suggests that MCQ tests can be adapted and used for different subject areas as well as to measure a great variety of educational objectives. Their flexibility means that they can be used for different levels of study or learning outcomes, from simple recall of knowledge to more complex levels, such as the student’s ability to analyse phenomena or apply principles to new situations.

However, good MCQ tests take time to develop. It is hoped that next year the process of developing the test will be less time-consuming, as we already have a bank of questions that we can use. This will enable randomisation of questions, which will also help to avoid misconduct. We are also investigating options that would allow the test to be administered online, meaning that feedback could be offered immediately, reducing even further the time and effort required to mark the test.
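
As an illustration of the randomisation idea only (it does not describe any specific Blackboard or online-test feature), the short Python sketch below shows how a different random subset of questions could be drawn from a question bank for each student; the bank size, the number of questions per paper and the question IDs are all hypothetical.

# Illustrative sketch only: drawing a different random subset of questions per
# student from a question bank. Bank size and paper length are hypothetical.
import random

QUESTION_BANK = ["Q%03d" % i for i in range(1, 61)]  # hypothetical bank of 60 question IDs
QUESTIONS_PER_TEST = 20                              # hypothetical paper length

def build_paper(student_id):
    """Return a randomised selection of questions for one student."""
    rng = random.Random(student_id)  # seeding per student keeps each paper reproducible
    return rng.sample(QUESTION_BANK, QUESTIONS_PER_TEST)  # random subset, in random order

if __name__ == "__main__":
    print(build_paper("student_001"))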

Follow up

MCQ tests are not a panacea; just like any other type of assessment tool, they have advantages and limitations. This project has confirmed that MCQ tests are adaptable and can be used for different subject areas, as well as to measure a great variety of educational objectives. The evaluation of the assessment design will continue next year, and further feedback will be collected from the cohort and next year’s student representatives.

Independent research and research dissemination in undergraduate teaching

Dr. Ute Woelfel, Literature and Languages
u.wolfel@reading.ac.uk
Year of activity: 2016/17

Overview

In order to improve students’ engagement, support their abilities as independent learners, and increase their feeling of ownership of their academic work, elements of independent research and research dissemination (through the creation of research posters) were included in a Part 2 module.

Objectives

  • Boost independent learning.
  • Nurture research interests.
  • Increase feeling of ownership.
  • Develop employability skills.

Context

In 2016/17 I introduced a new Part 2 module on German National Cinema (GM2CG: 20 credits/30 contact hours). The module is intended to give students a general overview of German cinema from the end of World War I to German unification, and at the same time allow sustained independent work on themes of interest. In order to increase engagement with these themes, the independent work is research-oriented, requiring students to reflect on their own expectations and aims, their goals for the module and indeed the course, and to develop their own interests and approach.

Implementation

At the beginning of the module, students were asked to pick a period or topic from a list and prepare a presentation. The presentation was not part of the summative assessment but served as a foundation for further research. After the presentation, individual discussions with each student were used to decide which aspect of the theme/topic the student would like to pursue further. After each term, essay surgeries were offered in which students were given the opportunity to discuss the research done so far and decide on a concrete research question for their essay (2,500 words/30%). The students were then asked to turn the findings of their essays into research posters for dissemination to non-specialist audiences (10%). In order to make sure that students also gain a general understanding of German cinema, a final exam (60%) is scheduled in the summer term.

Impact

The inclusion of independent research elements was very successful, in that students engaged more than they normally do when given set topics and essay titles. The majority of students found secondary sources, and even additional primary sources, and often identified research topics they would like to pursue in the future. Both the essay and the exam marks were above average. The poster challenged students to re-think their academic findings and present them in a new, visually organised format for interested general audiences; as we used the posters to showcase the students’ work at the University’s Languages Festival, the Visit Days and a Reading Scholars outreach event, a sense of the importance of their work emerged and pride in what they had achieved grew. The students understood the relevance of the poster for the development of professional skills.

Reflections

The module worked well and highlighted most of all the potential that our students have and can develop in the right learning environment, as well as their willingness to work hard when they are committed. Their engagement with independent research signalled a wish to get active and explore options beyond the set class texts rather than being spoon-fed; there is a clear need to feel involved, responsible and in charge of one’s work. I was particularly surprised by how much effort students were prepared to put into the presentations, despite the fact that they did not count towards the module mark; as the presentations were used as a foundation for assessed work, students clearly understood their benefit.

The research elements made the module teaching- and learning-intensive, as a good number of office hours and slots during the enhancement weeks were used for individual discussions of research and essay topics. As I wanted the students to put their research posters to good use as well, additional feedback slots were offered in which I discussed not just marks but ways of improving the posters. Students showed great willingness to keep working on their posters just to see them exhibited, despite the fact that any further input would not change the mark.

Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 1)

Dr Madeleine Davies, School of Literature and Languages

Overview

The Department of English Literature (DEL) is organising student focus groups as part of our TLDF-funded ‘Diversifying Assessments’ project, led by Dr Chloe Houston and Dr Madeleine Davies. This initiative is in dialogue with the Curriculum Framework’s emphases on engaging students in Programme Development and involving them as stakeholders. This entry outlines the preparatory steps taken to set up our focus groups, the feedback from the first meeting, and our initial responses to it.

Objectives

  • To involve students in developing a more varied suite of assessment methods in DEL.
  • To hear student views on existing assessment patterns and methods.
  • To gather student responses to electronic methods of assessment (including learning journals, blogs, vlogs and wikis).

Context

We wanted to use Curriculum Framework emphases on Programme Review and Development to address assessment practices in DEL. We had pre-identified areas where our current systems might usefully be reviewed and we decided to use student focus groups to provide valuable qualitative data about our practices so that we could make sure that any changes were informed by student consultation.

Implementation

I attended a People Development session ‘Conducting Focus Groups’ to gather targeted knowledge about setting up focus groups and about analytical models of feedback evaluation. I also attended a CQSD event, ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, to gain new ideas about feedback practice.

I applied for and won TLDF mini-project funding to support the Diversifying Assessments project. The TLDF funding enabled us to treat the student focus groups as a year-long consultative process, supporting a review of assessment models and feedback practices in DEL.

In Spring Term 2017, I emailed our undergraduate students and attracted 11 students for the first focus group meeting. We aim to include as diverse a range of participants as possible in the three planned focus group meetings in 2016-17. We also aim to draw contributors from all parts of the undergraduate programme.

To prepare the first focus group:

  • I led a DEL staff development session on the Diversifying Assessment project at the School of Literature and Languages’ assessment and feedback away day; this helped me to identify key questions and topics with colleagues.
  • I conducted a quantitative audit of our assessment patterns and I presented this material to the staff session to illustrate the nature of the issues we aim to address. This tabulated demonstration of the situation enabled colleagues to see that the need for assessment and feedback review was undeniable.

At the first focus group meeting, topics and questions were introduced by the two project leaders and our graduate intern, Michael Lyons, took minutes. We were careful not to approach the group with clear answers already in mind: we used visual aids to open conversation (see figures 1 and 2) and to provide the broad base of key debates. We also used open-ended questions to encourage detail and elaboration.

Group discussion revealed a range of issues and opinions that we would not have been able to anticipate had we not held the focus group:

  • Students said that a module’s assessment pattern was the key determinant in their selection of modules.
  • Some students reported that they seek to avoid exams where possible at Part Two.
  • Discussing why they avoid exams, students said that the material they learn for exams does not ‘stick’ in the same way as material prepared for assessed essays and learning journals, so they feel that exams are less helpful for learning. Some stated that they do not believe exams offer a fair assessment of their work.
  • Students wholly supported the use of learning journals because they spread the workload and because they facilitate learning. One issue the students emphasised, however, was that material supporting learning journals had to be thorough and clear.
  • Presentations were not rated as highly as a learning or assessment tool, though a connection with employability was recognised.
  • Assessed essays were a popular method of assessment: students said they were proud of the work they produced for summative essays and that only ‘bunched deadlines’ caused them problems (see below). This response was particularly marked at Part Two.
  • Following further discussion it emerged that our students had fewer complaints about the assessment models we used, or about the amount of assessment in the programme, than they did about the assessment feedback. This is represented below:

To open the conversation, students placed a note on a scale in answer to the question, ‘Do we assess too much, about right, or not enough?’ (‘About right’ was the clear winner.)

Students then placed a note on a second scale in answer to the question, ‘Do we give you too much feedback, about right, or too little?’ (The responses clustered between ‘about right’ and ‘too little’.)


The results of this exercise, together with our subsequent conversation, helped us to understand the importance of feedback to the Diversifying Assessments project. However, subsequent to the focus group meeting, the DEL Exams Board received an excellent report from our External Examiners, who described our feedback practices as ‘exemplary’. We will disseminate this information to our students who, with no experience of feedback practices other than at the University of Reading, may not realise that DEL’s feedback is regarded as an example of best practice by colleagues from other institutions. We are also considering issuing our students with updates when assessed marking is underway, so that they know when to expect their marks and can see that we consistently meet the 15-day turnaround. The External Examiners’ feedback will not, however, prevent us from continuing to reflect on our feedback processes in an effort to enhance them further.

Following the focus group meeting, we decided to test the feedback we had gathered by sending a whole-cohort online survey. For this survey, we changed the ‘feedback’ question slightly to encourage a more detailed and nuanced response. The results, which confirmed the focus group findings, are summarised below (with thanks to Michael Lyons for producing these graphics for the project):

A total of 95 DEL students took part in the survey. 87% said they valued the opportunity to be assessed with diverse methods.

Assessed essays were the most popular method of assessment, followed by the learning journal. However, only a small proportion of students have been assessed by learning journal, which suggests that a high percentage of those who have experienced this method rated it as their preferred form of assessment.

On a scale from 0-10 (with 0 being too little, 5 about right, and 10 too much), students gave an average score of 5.1 for the level of assessment on their programmes, with 5 as both the mode and the median score.

34% found the level of detail covered most useful in feedback, 23% the feedback on writing style, 16% the clarity of the feedback, and 13% its promptness. 7% cited other issues (e.g. ‘sensitivity’) and 7% did not respond to this question.

66% said they always submit formative essays, 18% do so regularly, 8% half of the time, 4% sometimes, and 4% never do.

40% said they always attend essay supervisions (tutorials) for their formative essays, 14% do so regularly, 10% half of the time, 22% sometimes, and 14% never do.

Impact

The focus group conversation suggested that the area on which we need to focus in DEL, in terms of diversification of assessment models, is Part Two assessment provision because Part One and Part Three already have more diversified assessments. However, students articulated important concerns about the ‘bunching’ of deadlines across the programme; it may be that we need to consider the timing of essay deadlines as much as we need to consider the assessment models themselves. This is a conversation that will be carried forward into the new academic year.

Impact 1: Working within the programme requirement (two different types of assessment per module), we plan to move more modules away from the 2,000-word assessed essay and exam model that 80% of our Part Two modules have been using. We are now working towards an assessment landscape in which, in the 2017-18 academic session, only 50% of Part Two modules will use this pattern. The others will use a variety of assessment models, potentially including: learning journals and assessed essays; assessed presentations and assessed essays; vlogs and exams; wikis, presentations and assessed essays; and blogs and 5,000-word module reports.

Impact 2: We will address the ‘bunched’ deadlines problem by producing an assessment spreadsheet that plots each assessment point on each module, allowing us to retain an overview of students’ workflow and to spread deadlines more evenly.
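A minimal sketch of how such a deadline overview could be generated automatically is given below; the module codes, deadlines and the week-by-week grouping are all hypothetical, and simply illustrate the kind of ‘bunching’ check the spreadsheet is intended to support.

```python
from collections import defaultdict
from datetime import date

# Hypothetical assessment points: (module code, assessment type, deadline).
# In practice these would be read from the departmental spreadsheet.
assessments = [
    ("EN2XX", "assessed essay",   date(2018, 1, 15)),
    ("EN2YY", "learning journal", date(2018, 1, 17)),
    ("EN2ZZ", "assessed essay",   date(2018, 1, 18)),
    ("EN2AA", "presentation",     date(2018, 2, 26)),
]

def find_bunched_deadlines(points):
    """Group deadlines by ISO calendar week and flag weeks with more than one."""
    weeks = defaultdict(list)
    for module, kind, deadline in points:
        year, week, _ = deadline.isocalendar()
        weeks[(year, week)].append((module, kind, deadline))
    return {week: items for week, items in weeks.items() if len(items) > 1}

for (year, week), items in find_bunched_deadlines(assessments).items():
    print(f"Week {week} of {year} has {len(items)} deadlines:")
    for module, kind, deadline in items:
        print(f"  {module}: {kind} due {deadline.isoformat()}")
```

Plotting the same grouped data week by week would give the visual overview of student workflow described above.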

Impact 3: The next phase of the project will focus on the type, quality and delivery of feedback. Prior to the focus group, we had not realised how crucial this issue was, though the External Examiners’ 2017 report for DEL suggests that communication may be the more important factor in this regard. Nevertheless, we will disseminate the results of the online survey to colleagues and encourage more detail, and more advice on writing style, in feedback.

Anticipated impact 4: We are expecting enhanced attainment as a result of these changes because the new assessment methods, and the more even spread of assessment points, will allow students to present work that more accurately reflects their ability. Further, enhanced feedback will provide students with the learning tools to improve the quality of their work.

Reflections

Initially, I had some reservations about whether student focus groups could give us the reliable data we needed to underpin assessment changes in DEL. However, the combination of quantitative data (via the statistical audit I undertook and the online survey) and qualitative data (gathered via the focus groups and, again, the online survey) has produced a dependable foundation. In addition, ensuring that a focus group includes a diverse range of students, drawn from all levels of the degree and from as many communities as possible within the cohort, is essential for the credibility of the subsequent analysis of responses. Thorough reporting is also essential, as is the need to listen to what is being said: we had not fully appreciated how important the ‘bunched deadlines’, ‘exams’ and ‘feedback’ issues were to our students. Focus groups cannot succeed unless those convening them respond proactively to feedback.

Follow up

There will be two further DEL student focus group meetings: one in Autumn Term 2017 (to provide feedback on our plans and to encourage reflection on feedback) and one in Spring Term 2018 (for a final consultation prior to implementation of the new assessment strategies). It is worth adding that, although we have not yet advertised the Autumn Term meeting, six students have already emailed me requesting a place. There is clearly an appetite to become involved in our assessment review, and student contribution to the process has already demonstrated its value for teaching and learning development.

Game-based learning using social media

Dr Stanimira Milcheva, Henley Business School
stani.milcheva@henley.reading.ac.uk
Year of activity: 2015/16

Overview

We designed a simple game (called the REFinGame) which was aligned with the course material and launched it on Facebook. This approach, which could easily be applied to other discipline areas, was successfully used to enhance student learning and engagement with modules related to real estate finance.

Objectives

  • Allow students to develop transferable skills.
  • Allow students to apply course material in a real-world scenario.
  • Provide immediate and personalised feedback.
  • Improve interactions among students and between students and the lecturer.
  • Help make the module more interactive and enjoyable for students.

Context

Real Estate Finance and Debt Markets (REMF41) is a master’s module within Henley Business School. During the module, students gain an awareness of the financing process for real estate from both a borrower’s and a lender’s point of view. The game was designed so that students could apply the course material and learn to assess the risks associated with financing decisions.

Implementation

First, the REFinGame was designed together with Professor Charles Ward before the beginning of the module. The design had to take the course material into account and make simplifying assumptions so that the game could be modelled to best represent reality. The idea was that students would play the game over the course of the module, outside the classroom. The game is about making financing decisions. Students are split into property developers (investors) and lenders (banks). The developers decide how many properties to develop, depending on how much money they have and how much funding they need from the bank. They also decide on the type of property, the location and other characteristics. The banks decide how much funding to provide to each developer. The game is played on Facebook on a weekly basis, with new information introduced on the Facebook Wall each week. Students advertise properties on the Wall, and the game coordinator sets the transaction price of the buildings based on the total supply from developers and the macroeconomic situation in that period. The main idea is that students learn to assess the risks associated with financing decisions, as they can lose the virtual money they have available by making the wrong decisions. The game is won by the student who accumulates the greatest amount of money.
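To give a flavour of the bookkeeping involved, the sketch below simulates one simplified weekly round: each developer funds building partly through a bank loan, and the coordinator sets a transaction price that falls as total supply rises and moves with a macroeconomic factor. The balances, pricing rule and interest rate are invented for illustration and are not the actual REFinGame parameters.

```python
import random

# Hypothetical starting balances in virtual money (not the actual REFinGame figures).
developers = {"Dev A": 100.0, "Dev B": 100.0}
banks = {"Bank X": 200.0}

def weekly_round(supply, loans, base_price=60.0, macro=1.0, rate=0.05):
    """Settle one weekly round of the simplified game.

    supply: units each developer brings to market, e.g. {"Dev A": 2}
    loans:  (bank, amount) borrowed by each developer, e.g. {"Dev A": ("Bank X", 40.0)}
    The coordinator's price falls as total supply rises and scales with the macro factor.
    """
    price = base_price * macro / max(sum(supply.values()), 1)
    for dev, units in supply.items():
        bank, amount = loans.get(dev, (None, 0.0))
        developers[dev] += units * price - amount * (1 + rate)  # revenue minus loan repayment
        if bank is not None:
            banks[bank] += amount * rate  # the bank keeps the interest
    return price

# One illustrative week: a randomly drawn macroeconomic factor and modest supply.
price = weekly_round(
    supply={"Dev A": 2, "Dev B": 1},
    loans={"Dev A": ("Bank X", 40.0), "Dev B": ("Bank X", 20.0)},
    macro=random.uniform(0.8, 1.2),
)
print(f"Transaction price this week: {price:.2f}")
print("Developer balances:", developers)
print("Bank balances:", banks)
```

In the game itself these decisions are of course negotiated by the students week by week; the point of the sketch is only to show how price, supply and financing risk interact.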

A closed Facebook group was created for the module, a logo was designed for the game, and students were briefed on how to play. The developers and lenders had to negotiate loan conditions using Facebook messages. They then advertised the properties they developed by posting pictures and information on the Wall. The purchase prices were then communicated to the developers by private message, and information about the economy and the markets was distributed as a post on the Wall. Students had to fill in a spreadsheet each week and send this to the game instructor, who then provided feedback to each student. At the end of the game, students shared their experience by giving an assessed presentation in which they set out their strategy and performance throughout the game and compared it to their peers’.

Impact

A significant relationship was found between students’ performance in the game and their overall module mark. Less tangible outcomes are that the game can help students develop skills such as problem solving, creativity and strategic behaviour, and that it increases interaction among students and between students and the lecturer. In particular, we found that playing a game on Facebook helped to better integrate students who might be more reticent in class discussions. The lecturer can also develop a better idea of each student’s performance, so students receive tailored and regular feedback and are able to improve throughout the game. This was one of the main advantages students identified, along with the playfulness of the game and the ease with which it is played on Facebook. The major issue students faced was the perception that the course material is not directly applied in the game. This demonstrates that it is important to manage student expectations and to take a structured approach to game design. Ultimately, our goal is to create guidelines for using simple self-designed games incorporating Facebook, and to improve student learning.
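As an illustration of how such a relationship might be checked, the short sketch below correlates final game winnings with module marks; the figures are invented, and the original analysis may well have used a different test.

```python
from scipy.stats import pearsonr

# Invented data: final game winnings (virtual money) against overall module marks.
winnings = [145, 160, 90, 120, 175, 105, 130, 150]
marks    = [68,  72,  55, 62,  74,  58,  65,  70]

r, p = pearsonr(winnings, marks)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # a small p-value suggests the link is unlikely to be chance
```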

Reflection

The novelty of our approach is that we did not design a video game or a digital game using special software, but instead designed a simple game to be played online using Facebook as a platform. We wanted to show how, with limited resources and time, an instructor can construct a game and engage students with it, since Facebook is free and widely used by students. We observed that the main challenges in designing the game were ensuring that it aligned with the course material and managing student expectations. The instructor should therefore explain very clearly how the game will benefit students and how they will be assessed, and it is crucial to communicate how the course material can be used within the game to make decisions. The game designer needs to make sure that students see the direct link between the course material and the learning outcomes of the game.

Virtual teaching collections in Archaeology and Classics: turning artefacts into 3D models

Dr Robert Hosfield, School of Archaeology, Geography and Environmental Science

r.hosfield@reading.ac.uk

Year of activity: 2015/16

Sample image: lekythos

Overview

The project tested different methods for producing and disseminating 3D models of existing artefacts in the teaching collections of Classics and Archaeology. 3D scanning was labour-intensive and struggled to represent some of the raw materials accurately. By contrast, photogrammetry was more cost- and time-effective and produced better-quality results (see the attached figure). Sketchfab proved an effective, user-friendly platform for disseminating the models (https://sketchfab.com/uremuseum), and student feedback was positive.

Objectives

  1. Produce and evaluate 3D laser scans of 10 lithic artefacts and 5 ceramic artefacts from the teaching collections of Classics and Archaeology, with analysis of 3D model resolution, cost, and time requirements, and dissemination options;
  2. Document student evaluations of the new resources.

Context

Archaeology and Classics have wide ranging teaching collections of objects, both genuine and replica, from the human past (e.g. Greek and Roman ceramics). While students have access to this material in practical classes and seminars, out-of-class access is more difficult, due to (i) the intensive use of the teaching spaces holding the collections, and (ii) the fragility of selected specimens. The project explored methods that could enable students to engage with this material evidence through digital models.

Implementation

The project was primarily undertaken by four Reading students, both postgraduate and undergraduate: Rosie-May Howard (BSc Archaeology, Part 2), Matthew Abel (BA Museum Studies & Archaeology, Part 1), Daniel O’Brien (BA Ancient History & Archaeology, Part 3), and James Lloyd (Classics, PGR). Supervision and support were provided by Prof. Amy Smith (Classics), Dr Rob Hosfield (Archaeology) and Dr Stuart Black (Archaeology). The four students undertook the following tasks:

(i) Testing the Ure Museum’s NextEngine™ HD 3D scanner and its associated processing software, ScanStudio™, to produce 3D laser scan models of selected artefacts (ceramics from the Ure Museum and stone tools from the Archaeology teaching collections).

(ii) Testing 3D printing of the laser scan models using the Ure Museum’s CubePro™ 3D printer.

(iii) Testing the digital representation of the same range of artefacts through photogrammetry, using Memento by Autodesk.

(iv) Trialling the use of Sketchfab as a remote site for posting, storing and accessing the 3D models (a scripted approach to the posting step is sketched after this list).

(v) Assessing student responses to the models through a SurveyMonkey questionnaire.
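As an indication of how the Sketchfab posting step could be scripted rather than carried out through the web interface, the sketch below uses the `requests` library against Sketchfab’s public model-upload endpoint. The API token, file name and form field names are assumptions based on Sketchfab’s Data API v3 and may need adjusting against the current documentation; this is an outline, not the workflow the students actually used.

```python
import requests

SKETCHFAB_MODELS_URL = "https://api.sketchfab.com/v3/models"
API_TOKEN = "your-sketchfab-api-token"  # hypothetical: issued per Sketchfab account

def upload_model(path, name, description=""):
    """Upload a model archive (e.g. a zipped .obj) to Sketchfab and return its uid."""
    with open(path, "rb") as model_file:
        response = requests.post(
            SKETCHFAB_MODELS_URL,
            headers={"Authorization": f"Token {API_TOKEN}"},
            files={"modelFile": model_file},
            data={"name": name, "description": description},
        )
    response.raise_for_status()
    return response.json()["uid"]

# Hypothetical usage:
# uid = upload_model("lekythos.zip", "Lekythos (teaching collection)",
#                    "Photogrammetry model produced for the virtual teaching collections project")
# print(f"https://sketchfab.com/models/{uid}")
```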

Impact

(i) The 3D laser scan models provided volumetric data (unlike the photogrammetry models), but struggled with the regular shapes and repeating patterns which were characteristic of many of the ceramics. The laser scanning process was also time-intensive.

(ii) The laser scanner struggled to represent some of the stone artefacts, with the resulting models characterised by poorly defined edges and ‘holes’, due to the material properties of the flint raw material.

(iii) Photogrammetry was used successfully to create 3D models of ceramics from the Ure Museum collection.

(iv) Sketchfab provided a flexible interface for ‘touching up’ and annotating the models, and was more user-friendly than other options (e.g. ScanStudio).

The quality of the 3D printing was mixed, leading to a decision during the project to focus on digital models that could be accessed on-line.

(v) Students responded positively to the virtual models, and would like to see more in future!

Sample survey questions and responses:

Q: What (if any) other objects/material types would you like to see as 3D models?

A: It would be interesting to see 3D models of smaller, more dainty objects as these can often be difficult to look at on such a small scale.

Q: Do you have any other comments?

A: This is a great project that should keep going! P.S. A scale will be helpful for accurately describing the objects. There’s a Part 2 Archaeology module called Artefacts in Archaeology and the scans could be used as an at-home resource by students.

Reflections

The project was successful in clearly highlighting the relative strengths and weaknesses of 3D laser scanning and photogrammetry for creating digital models of artefacts. In terms of cost and time, photogrammetry was clearly the more effective method, while the mixed results of the 3D printing experiments confirmed that online hosts such as Sketchfab were the most effective way of disseminating the models.

More specifically, exploring the photogrammetry option highlighted the potential of the Agisoft PhotoScan software as an effective method for Museums or HEIs wishing to capture large collections for teaching and/or archiving purposes.

Student responses emphasised the importance of providing a wide range of models if these sorts of teaching resources are to be further developed.

Follow up

Archaeology has purchased copies of the Agisoft PhotoScan software and is currently looking to develop a photogrammetry-based digital database of its teaching collections.
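A minimal sketch of how such a database could be built up in batches is shown below, using the Python scripting interface bundled with PhotoScan Professional. The module name, method names and folder layout are recalled from PhotoScan 1.x (the product has since been renamed Agisoft Metashape) and may differ from the installed version, so this should be read as an outline rather than a tested pipeline.

```python
import os
import PhotoScan  # scripting module bundled with PhotoScan Professional (assumed available)

PHOTO_ROOT = "teaching_collection_photos"    # hypothetical: one sub-folder of photos per artefact
EXPORT_ROOT = "teaching_collection_models"   # hypothetical output folder for exported meshes
os.makedirs(EXPORT_ROOT, exist_ok=True)

doc = PhotoScan.Document()

for artefact in sorted(os.listdir(PHOTO_ROOT)):
    folder = os.path.join(PHOTO_ROOT, artefact)
    photos = [os.path.join(folder, f) for f in os.listdir(folder)
              if f.lower().endswith((".jpg", ".tif"))]
    if not photos:
        continue

    chunk = doc.addChunk()
    chunk.label = artefact
    chunk.addPhotos(photos)

    # Standard photogrammetry pipeline: align cameras, densify, then mesh.
    chunk.matchPhotos()
    chunk.alignCameras()
    chunk.buildDenseCloud()
    chunk.buildModel()

    chunk.exportModel(os.path.join(EXPORT_ROOT, artefact + ".obj"))

doc.save("teaching_collection.psx")
```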

At the Ure Museum, 3D scans are being made available via Sketchfab and more thorough use of photogrammetry is being considered; virtual models of the vases scanned for CL1GH are being used in seminars this term.

Links

https://sketchfab.com/uremuseum