Connecting with the Curriculum Framework: Using focus groups to diversify assessment (Part 1)

Dr Madeleine Davies, School of Literature and Languages

Overview

The Department of English Literature (DEL) is organising student focus groups as part of our TLDF-funded ‘Diversifying Assessments’ project, led by Dr Chloe Houston and Dr Madeleine Davies. This initiative is in dialogue with the Curriculum Framework’s emphases on engaging students in Programme Development and involving them as stakeholders. This entry outlines the preparatory steps taken to set up our focus groups, the feedback from the first meeting, and our initial responses to it.

Objectives

  • To involve students in developing a more varied suite of assessment methods in DEL.
  • To hear student views on existing assessment patterns and methods.
  • To gather student responses to electronic methods of assessment (including learning journals, blogs, vlogs and wikis).

Context

We wanted to use Curriculum Framework emphases on Programme Review and Development to address assessment practices in DEL. We had pre-identified areas where our current systems might usefully be reviewed and we decided to use student focus groups to provide valuable qualitative data about our practices so that we could make sure that any changes were informed by student consultation.

Implementation

I attended a People Development session ‘Conducting Focus Groups’ to gather targeted knowledge about setting up focus groups and about analytical models of feedback evaluation. I also attended a CQSD event, ‘Effective Feedback: Ensuring Assessment and Feedback works for both Students and Staff Across a Programme’, to gain new ideas about feedback practice.

I applied for and won TLDF mini-project funding to support the Diversifying Assessments project. The TLDF funding enabled us to treat the student focus groups as a year-long consultative process, supporting a review of assessment models and feedback practices in DEL.

In Spring Term 2017, I emailed our undergraduate students and attracted 11 students for the first focus group meeting. We aim to include as diverse a range of participants as possible in the three planned focus group meetings in 2016-17. We also aim to draw contributors from all parts of the undergraduate programme.

To prepare the first focus group:

  • I led a DEL staff development session on the Diversifying Assessment project at the School of Literature and Languages’ assessment and feedback away day; this helped me to identify key questions and topics with colleagues.
  • I conducted a quantitative audit of our assessment patterns and presented this material at the staff session to illustrate the issues we aim to address. Tabulating the data in this way made the need for an assessment and feedback review clear to colleagues.

At the first focus group meeting, topics and questions were introduced by the two project leaders and our graduate intern, Michael Lyons, took minutes. We were careful not to approach the group with clear answers already in mind: we used visual aids to open conversation (see figures 1 and 2) and to provide the broad base of key debates. We also used open-ended questions to encourage detail and elaboration.

Group discussion revealed a range of issues and opinions that we would not have been able to anticipate had we not held the focus group:

  • Students said that a module’s assessment pattern was the key determinant in their selection of modules.
  • Some students reported that they seek to avoid exams where possible at Part Two.
  • Discussing why they avoid exams, students said that the material they learn for exams does not ‘stick’ in the same way as material prepared for assessed essays and learning journals so they feel that exams are less helpful in terms of learning. Some stated that they do not believe that exams offer a fair assessment of their work.
  • Students wholly supported the use of learning journals because they spread the workload and because they facilitate learning. One issue the students emphasised, however, was that material supporting learning journals had to be thorough and clear.
  • Presentations were not rated as highly as a learning or assessment tool, though a connection with employability was recognised.
  • Assessed essays were a popular method of assessment: students said they were proud of the work they produced for summative essays and that only ‘bunched deadlines’ caused them problems (see below). This response was particularly marked at Part Two.
  • Following further discussion it emerged that our students had fewer complaints about the assessment models we used, or about the amount of assessment in the programme, than they did about the assessment feedback. This is represented below:

To open conversation, students placed a note on the scale. The question was, ‘Do we assess too much, about right, not enough?’ (‘About right’ was the clear winner).

Students placed a note on the scale: the question was, ‘Do we give you too much feedback, about right, or too little?’ (The responses favoured the scale between ‘about right’ and ‘too little’.)


The results of this exercise, together with our subsequent conversation, helped us to understand the importance of feedback to the Diversifying Assessment project. However, after the focus group meeting, the DEL Exams Board received an excellent report from our External Examiners, who stated that our feedback practices are ‘exemplary’. We will disseminate this information to our students who, with no experience of feedback practices other than at the University of Reading, may not realise that DEL’s feedback is regarded as an example of best practice by colleagues from other institutions. We are also considering issuing our students with updates when assessed marking is underway, so that they know when to expect their marks and can see that we consistently meet the 15-day turnaround. The External Examiners’ feedback will not, however, prevent us from continuing to reflect on our feedback processes in an effort to enhance them further.

Following the focus group meeting, we decided to test the feedback we had gathered by sending a whole-cohort online survey: for this survey, we changed the ‘feedback’ question slightly to encourage a more detailed and nuanced response. The results, which confirmed the focus group findings, are represented below (with thanks to Michael Lyons for producing these graphics for the project):

A total of 95 DEL students took part in the survey. 87% said they valued the opportunity to be assessed with diverse methods.

Assessed essays were the most popular method of assessment, followed by the learning journal. However, only a small proportion of students have been assessed with a learning journal, meaning it is likely that a high percentage of those who have been assessed this way stated it to be their preferred method of assessment.

On a scale from 0-10 (with 0 being too little, 5 about right, and 10 too much), the students gave an average score of 5.1 for the level of assessment on their programmes, with 5 being both the mode and the median score.

34% found the level of detail covered most useful in feedback, 23% the feedback on writing style, 16% the clarity of the feedback, and 13% its promptness. 7% cited other issues (e.g. ‘sensitivity’) and 7% did not respond to this question.

66% said they always submit formative essays, 18% do so regularly, 8% half of the time, 4% sometimes, and 4% never do.

40% said they always attend essay supervisions (tutorials) for their formative essays, 14% do so regularly, 10% half of the time, 22% sometimes, and 14% never do.
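The summary statistics reported above (an average of 5.1 on a 0-10 scale, with a mode and median of 5) can be reproduced with a few lines of code. The sketch below uses an invented response list purely for illustration; it is not the actual DEL survey data.

```python
# Illustrative sketch: summarising 0-10 scale survey responses into the
# mean, median and mode figures used in the report above.
# The response list is invented for demonstration only.
from statistics import mean, median, mode

def summarise(responses):
    """Return (mean, median, mode) for a list of 0-10 integer scores."""
    return round(mean(responses), 1), median(responses), mode(responses)

# Hypothetical cohort: most responses cluster around 'about right' (5).
responses = [5, 5, 5, 6, 4, 5, 7, 5, 4, 6, 5, 5]
avg, med, mod = summarise(responses)
print(avg, med, mod)
```

With the real exported survey responses substituted in, the same function yields the figures reported above.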

Impact

The focus group conversation suggested that the area on which we need to focus in DEL, in terms of diversification of assessment models, is Part Two assessment provision because Part One and Part Three already have more diversified assessments. However, students articulated important concerns about the ‘bunching’ of deadlines across the programme; it may be that we need to consider the timing of essay deadlines as much as we need to consider the assessment models themselves. This is a conversation that will be carried forward into the new academic year.

Impact 1: Working within the programme requirement (two different types of assessment per module), we plan to move more modules away from the 2,000-word assessed essay and exam model that 80% of our Part Two modules have been using. We are now working towards an assessment landscape where, in the 2017-18 academic session, only 50% of Part Two modules will use this assessment pattern. The others will use a variety of assessment models, potentially including learning journals and assessed essays; assessed presentations and assessed essays; vlogs and exams; wikis, presentations and assessed essays; and blogs and 5,000-word module reports.

Impact 2: We will address the ‘bunched’ deadlines problem by producing an assessments spreadsheet that plots each assessment point on each module, allowing us to retain an overview of students’ workflow and to spread deadlines more evenly.
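As an illustration of how such a spreadsheet overview might work, the sketch below flags weeks in which assessment deadlines ‘bunch’. The module codes, task names and dates are invented examples, not real DEL assessment points.

```python
# Illustrative sketch of the planned deadline-overview exercise: given each
# module's assessment deadlines, flag ISO weeks where submissions 'bunch'.
# Module codes and dates are invented examples, not real DEL data.
from datetime import date
from collections import Counter

def bunched_weeks(deadlines, threshold=2):
    """Return sorted (ISO year, ISO week) pairs holding more than
    `threshold` deadlines."""
    weeks = Counter(d.isocalendar()[:2] for d in deadlines.values())
    return sorted(wk for wk, n in weeks.items() if n > threshold)

deadlines = {
    "EN2AA essay": date(2018, 1, 12),
    "EN2BB journal": date(2018, 1, 11),
    "EN2CC essay": date(2018, 1, 10),
    "EN2DD wiki": date(2018, 2, 16),
}
print(bunched_weeks(deadlines))  # flags the week of 10-12 January
```

In practice the same check could be run over the deadline column of the assessments spreadsheet whenever a module changes its submission date.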

Impact 3: The next phase of the project will focus on the type, quality and delivery of feedback. Prior to the Focus Group, we had not realised how crucial this issue is, though the External Examiners’ 2017 report for DEL suggests that communication may be the more crucial factor in this regard. Nevertheless, we will disseminate the results of the online survey to colleagues and encourage more detail and more advice on writing style in feedback.

Anticipated impact 4: We are expecting enhanced attainment as a result of these changes because the new assessment methods, and the more even spread of assessment points, will allow students to present work that more accurately reflects their ability. Further, enhanced feedback will provide students with the learning tools to improve the quality of their work.

Reflections

Initially, I had some reservations about whether student focus groups could give us the reliable data we needed to underpin assessment changes in DEL. However, the combination of quantitative data (via the statistical audit I undertook and the online survey) and qualitative data (gathered via the focus groups and again by the online survey) has produced a dependable foundation. In addition, ensuring the inclusion of a diverse range of students in a focus group, drawn from all levels of the degree and from as many communities as possible within the cohort, is essential for the credibility of the subsequent analysis of responses. Thorough reporting is also essential, as is the need to listen to what is being said: we had not fully appreciated how important the ‘bunched deadlines’, ‘exams’, and ‘feedback’ issues were to our students. Focus groups cannot succeed unless those convening them respond proactively to feedback.

Follow up

There will be two further DEL student focus group meetings, one in the Autumn Term 2017 (to provide feedback on our plans and to encourage reflection in the area of feedback) and one in the Spring Term 2018 (for a final consultation prior to implementation of new assessment strategies). It is worth adding that, though we have not yet advertised the Autumn Term focus group meeting, 6 students have already emailed me requesting a place on it. There is clearly an appetite to become involved in our assessment review and student contribution to this process has already revealed its value in terms of teaching and learning development.

Using online learning journals

Dr Nicola Abram, School of Literature and Languages

n.l.abram@reading.ac.uk

Overview

This entry describes the use of online Learning Journals on a Part Three English Literature module. This method of assessment supports students to carry out independent research and to reflect on their personal learning journey, and rewards students’ sustained engagement and progress.

Objectives

  • To encourage reflective learning.
  • To promote independent learning.
  • To facilitate weekly cumulative contributions to summative assessment.
  • To reward development rather than final attainment.

Context

The Part Three optional module Black British Fiction (EN3BBF) is characterised by a large number of set texts that are read at a fast pace. During a single term it covers the period from 1950 to the present day, and asks students to engage with novels, short stories, poetry, a play, and a film, as well as critical theory, history, autobiography, documentary, blogs, political speeches, and press reviews. The module is also characterised by its relevance to historical and contemporary issues of social justice. The quantity and complexity of this material requires students to exercise their independence, taking responsibility for their learning beyond the weekly three hours of tutor-led seminars.

Learning Journals had been in use for this and other modules in the Department of English Literature for several years, in the format of paper workbooks pre-printed with set questions. This effectively served the purpose of structuring students’ weekly studies and directing discussion in seminars. Students worked extremely hard to record their learning in this format, often going beyond the standard material to include additional reading and research of relevance to the module.

However, the paper workbook sometimes resulted in an excess of material that was diluted in focus and difficult to evaluate. Another problem was that the handwritten Journal was retained by the University after submission, meaning students lost this rich record of their learning.

To improve this situation, consultations were held with colleagues in the Department of English Literature and an alternative online Learning Journal was initiated in 2015/16.

Implementation

Experimentation with the Blackboard Journals tool helped to clarify its privacy controls, to ensure that tutors could see the work of all participating students but that students could not see each other’s entries. A discussion with the University of Reading TEL team clarified marking procedures, including making the Journal entries available to view by external examiners.

A discussion was held with colleagues who use paper or online Learning Journals, to establish generic assessment criteria and ensure parity of expectations.

In discussion with another module convenor it was decided that students would be required to submit ten weekly entries, each consisting of 400-500 written words or 4-5 minutes of audio or film recording. The choice of media was a proactive effort to make the Journal more accessible to students with dyslexia and those for whom English is an additional language. The subject of each entry could be determined by the student, prompted by questions on the reading list, discussion in seminars, personal reading, or other activities such as attendance at an exhibition or event.

In the first term of implementation (Autumn 2015) the full ten entries were assessed. In later iterations it was decided that students should instead select five entries to put forward for summative assessment. The selection process facilitates further self-reflection, and the option to discard some entries allows for experimentation without the threat of penalty.

The Learning Journal incorporates a vital formative function: students are invited to a 30-minute feedback tutorial to discuss their first five entries. This conversation refers to the module-specific and task-specific assessment criteria, supporting students to reflect on their work so far and to make plans to fill any gaps. The Learning Journal functions as a mode of assessment for learning, replacing the traditional task of the formative essay.

In terms of summative assessment, the five submitted Learning Journal entries account for 50% of the module mark. An essay constitutes the other 50%. These two forms of assessment are equivalent in scale, with each carrying a guideline of 2,500 words total.

Impact

The fact that students could nominate a selection of entries for summative assessment seemed to encourage risk-taking. Students were more willing to experiment with their critical responses to texts – by testing speculative interpretations, asking questions, or articulating uncertainty – and to express their ideas using creative practices. They became actively engaged in directing both the form and content of their learning.

The move to a restricted length per entry was designed to encourage students to distil their ideas, and to direct attention to the aspects of that week’s learning that most mattered to the student. This was successfully achieved, and feedback shows that they could see their own progress as the weeks passed.

Feedback also showed that students appreciated the opportunity to choose their own topic for each weekly entry, without the constraints of set questions. As a result, entries were remarkably varied. Some students took the opportunity to reflect on their personal circumstances or current political contexts (such as the construction of ‘Britain’ in the discourse around the EU referendum in 2016) using the technical vocabulary learned on the course; others explored creative media such as spoken word poetry. All students gained skills in a genre of writing different from the traditional essay format, which may prove useful for careers in the communication industries.

One unexpected benefit was that the online journal made it possible for the module convenor to track the students’ learning in real-time rather than waiting for summative assessments and end-of-term evaluations. This immediate insight enabled corrective action to be taken during the course of the module where necessary.

Reflections

Students were initially nervous about this unfamiliar method of assessment. Providing detailed module-specific and task-specific marking criteria, as well as example entries, helped to allay these fears. The decision to count only a selection of entries towards summative assessment significantly helped, allowing students to acclimatise to the task with more confidence. As the term progressed, students visibly transitioned towards autonomous learning.

The Learning Journal format proved particularly effective for this module as it created a ‘safe space’ in which students could reflect on the ways in which they have personally experienced, witnessed, or practised racism. Students’ self-reflection extended beyond the subject of skills, strengths and weaknesses to consider their embodied knowledge, ignorance, or privilege. They became more critical in their thinking and more alert and responsible as citizens. Articulating the potency of this real-world engagement, one student commented that “the consistency of the learning journal […] allowed my thinking to naturally mature and changed my outlook on society”.

Marking the Journals became much more efficient using the online format, as entries were typewritten and significantly condensed. Additionally, marking and moderating could be done remotely, without the need to exchange cumbersome documents in person.

It is striking that some students achieving high marks in their Learning Journals did not always achieve equivalent marks in their essays or other modules. I do not consider this to indicate an artificial inflation of grades; rather, I would argue that the Journal recognises and rewards skills that are overlooked in traditional assessment formats and undervalued elsewhere on our programmes. Some students used the Journal to record their personal contribution to seminar discussions and be rewarded for this, while for other students less likely to speak in class (perhaps due to EAL status, gender, disability, or personality) the private entries provided an important opportunity for their insights to be heard.

Follow up

Informal spoken feedback on the general use of Learning Journals was given to the group during seminars, and one-to-one feedback was given halfway through the module. However, several students sought additional reassurance about their entries. In 2017/18 I intend therefore to incorporate a peer-review exercise into the early weeks of the term, to allow students to benchmark their work against others’ and to promote the take-up of alternative media and approaches. This activity will help students to see themselves as a community of learners. Rather than presume that students have access to technology I will supply iPads belonging to the School of Literature and Languages for use in the classroom.

I also intend to circulate example entries in audio and video formats, to show that the Journal validates skills other than traditional essay-writing and to encourage students to experiment with alternative ways of demonstrating their learning.

Using screencasts to deliver skills training: a Part One English Literature module

Dr Nicola Abram, School of Literature and Languages
n.l.abram@reading.ac.uk

Year of activity: 2015-16

Overview


This entry describes the use of screencasts to deliver skills training on a compulsory Part One English Literature module. As a result of the changes outlined here, every student taking English Literature at the University of Reading will have access throughout their degree to a bank of online resources teaching key skills.

Objectives

  • To train students in the practical skills needed to succeed in an English Literature degree.
  • To induct students into the independent learning required for an English Literature degree.
  • To increase students’ engagement in skills training.
  • To improve students’ understanding of and adherence to academic conventions.
  • To make best use of the contact time (lectures and seminars) on the module.

Context

Over 200 students enter English Literature programmes at the University of Reading each year, from a range of educational backgrounds. To ensure they all have the key skills and theoretical understanding needed to succeed throughout their degrees, we run a compulsory module in Part One (first year) called ‘Research & Criticism’ (EN1RC).

In the previous incarnation of the module, the Autumn Term had been used for a series of 50-minute lectures on research methods, such as ‘Using online sources’, ‘Using published sources’, ‘Citations and referencing’, and ‘Academic writing’. Students also attended a 50-minute seminar each week, the content of which was determined by the seminar tutor. The Spring Term lectures and seminars then inducted students into foundational critical ideas like ‘narrative’, ‘reader’ and ‘author’, as well as issues such as ‘gender and sexuality’, and ‘race and empire’, via a series of set texts.

I was tasked with convening this module from 2014/15. On my appointment, I sought to engage students as more active participants in the skills training component.

Implementation

The process for developing this module began with an informal conversation with another tutor. We identified a disparity between the module content and the mode of delivery: the traditional lecture format did not seem to be the best vehicle for delivering skills training.

Believing that skills training is most effectively conducted through practical and interactive activities, I set about constructing a series of short formative tasks that would enable students to learn by doing. These were designed to break down the process of research and writing into its component parts, so that students could amass the necessary skills bit by bit. Feedback would be given quickly – usually the following week – by their seminar tutor, meaning changes could be implemented prior to attempting a summative (assessed) essay. The specific formative tasks set were: assembling a bibliography, integrating quotation into a short critical commentary, preparing an essay plan, summarising a fiction text, précising a critical text, and drafting an essay introduction.

Students were supported to undertake each task by a screencast: a short (3-5 minute) animation giving the key information about a particular skill and signposting further resources, which students could watch at their own pace and return to at leisure. Screencasts were released to students on a controlled basis via a dedicated area on the module’s Blackboard pages, accompanying the instructions for each formative task. Upon completion of the module, students had therefore engaged with a bank of ten different screencasts. They retain access to this throughout their degrees, via Blackboard.

Most of the screencasts were prepared using the screen-capture program Camtasia, for which we have multiple departmental licences. Colleagues who had previously delivered the skills lectures were given the technical support (where necessary) to repurpose that material into a screencast, and others were invited to volunteer new material. A colleague in Study Advice also contributed a screencast tailored to the needs of English Literature students. This collaborative approach produced a welcome range of different outcomes. Some colleagues used PowerPoint to present written and visual content, while others used Prezi, which better represents the spatial arrangement of the material. Some recorded a voiceover, which provided a welcome sense of connection with an individual tutor, while others chose to use a musical soundtrack downloaded from a royalty-free website such as www.incompetech.com. A few colleagues used the animation tools PowToon and VideoScribe, rather than simply recording a presentation onscreen.

A meeting with staff teaching on the module was held at the end of its first term and after its first full year. Their reflections on students’ submitted tasks and classroom engagement proved invaluable for the module’s iterative design.

Impact

As a result of this module, students are evidently more alert to the many components of professional writing and are better equipped to perform good academic practice. Selected comments from qualitative module evaluations affirmed the usefulness of this immersive model of skills training: “The first [formative] tasks such as the bibliography were very useful to bridge the gap into HE”, “All the feedback I received was very helpful and helped me improve my work”, and “The screencasts were also a fantastic idea”.

The screencasts have been watched multiple times by students, suggesting that they are a useful resource that can be returned to and referred to repeatedly. The current most-watched is ‘Incorporating quotations’, which has had 969 views since it was uploaded in January 2015.

Using screencasts as a teaching delivery tool has also provided the opportunity to develop the content of the course. Removing the skills content from lectures freed up contact time to be given to important theoretical material and set texts.

Reflections

The model of interactive skills training harnesses the power of constructive alignment, where teaching process and assessment method are calculated to maximise students’ engagement with the subject and/or skills being taught. Even for a discursive discipline like English, the QAA Subject Benchmark Statement encourages assessments “aimed at the development of specific skills (including IT and bibliographical exercises)”.

Although I did not have a particular student demographic in mind when making these changes, the staged development of writing skills seems to offer specific support to international students and English as additional language (EAL) learners, who may be unfamiliar with UK academic conventions and benefit from an atomised approach to writing with regular formative feedback. However, all students benefit from this formal induction to academic literacy. Running a core skills module has an equalising effect on the cohort, compensating for disparities in prior educational contexts and attainment.

Embedding the screencasts for viewing in Blackboard Learn was awkward, since they could not be watched inline by users whose devices did not support a specific plugin. The screencasts were therefore hosted on www.screencast.com, with stable links provided in Blackboard Learn. Both uploading and viewing were easy and effective, but the bandwidth cap (2GB per month) meant upgrading to a paid subscription (currently £8.36 per month) in months where traffic was particularly high. In future I will consider using YouTube, with appropriate privacy settings, to continue the periodic release of screencasts through link-only access.

Follow up

As of 2016-17, the module continues to run using screencasts as a key teaching method. Additional screencasts have been added to the suite as the need arose, for instance to support students’ use of Turnitin as a formative tool, in line with University of Reading strategy. Some screencasts have been replaced as a result of staff turnover. But most remain in use, meaning that the initial work to prepare the content and conduct the screen capture continues to pay off.

Various colleagues in the Department of English Literature have found screencasts to be a useful method for wider skills training. We are now preparing a suite of screencasts to support prospective students and new entrants with the transition to higher education, on topics like ‘What is a lecture?’ and ‘How should I communicate with my tutors?’. We also use screencasts more widely, including as a student assessment method: some of these, along with our public-facing promotional videos, have been given British Sign Language interpretation (contact Dr Cindy Becker for details).

Work is now being undertaken to enhance the training component of the module further through Technology Enhanced Learning, by using quizzes on Blackboard Learn to provide students with immediate feedback on their understanding of skills like proper referencing practice.

Links

Academic Writing: Essay presentation & proof-reading: http://www.screencast.com/t/EXn2au7r8Wj7

Writing a critical precis: http://www.screencast.com/t/83Wz0I4rA

Citations and referencing: http://www.screencast.com/t/aT8PolyDuH

English Literature at the University of Reading YouTube Channel: https://www.youtube.com/user/EnglishAtReading

Henley Business School staff guide to using Turnitin to aid in identifying and dealing with academic misconduct when marking

Edith Rigby, Henley Business School
e.rigby@henley.ac.uk

Overview

A review of existing University of Reading assessment advice, together with two staff workshops at Henley Business School, informed a new marker’s guide on fair marking and managing academic misconduct. The marker’s guide will specifically cover what to look out for and when and how to use Turnitin Originality Reports. The guide can also be used at staff development workshops.

Objectives

To produce a staff guide that:

  1. Distinguishes plagiarism, poor academic practice and academic misconduct
  2. Outlines different staff roles in relation to marking and assessment feedback
  3. Provides a training tool for new staff
  4. Promotes consistent marking and feedback practice including when or how to use Turnitin Originality Reports.

Context

Turnitin Originality Reports are used widely to detect potential academic misconduct. While there is some information on how to use Turnitin to help identify academic misconduct, there are no sessions or workshops on best practice specific to Henley Business School. Discrepancies can therefore arise across programmes in how similarity reports are used to advise students or inform marking.

Developing a new marker’s guide to best marking practice within Henley Business School could ensure a more consistent student experience.

Implementation

Two workshops with the School Director of Teaching and Learning, Directors of Studies and Programme Directors were held to first identify current processes and areas of concern around academic misconduct, and specific areas on which guidance was needed. Then the structure and core content for a new marker’s guide were agreed. The core content was to include: definitions of roles of admin teams, module convenors, markers and Directors of Studies; core definitions; processes; and advice on basic and best practice for new markers.

Then a basic but flexible guide with space for users to add more examples and narratives of best practice was developed. This was based on the results of the two workshops and a review of the Henley Good Academic Practice guide and test for students, and University of Reading advice and documentation around academic misconduct.

Core staff were invited to contribute to the guide as a work in progress, and this draft guide was used at staff workshops.

Impact

The basic guide achieved the project objectives.

Different disciplines across Henley Business School have different needs, and collating contributions from busy academics has resulted in a guide that is currently best used for workshops only. Once more contributions are forthcoming, an online version can be developed as required.

The key impact of this project has been to generate discussion and share practices around the

  • purpose and processes of marking
  • different types of assessment and the assessment literacies required for staff and students
  • handling large group assessment and marking.

Reflections

Being able to identify what new markers need to know about roles and processes has turned out to be essential given the changes to the Academic Misconduct policy over the last two academic years. Academics and programme administrators alike have found this part of the guide particularly helpful.

Henley Business School Directors of Studies also found the workshop discussions useful and pertinent to assessment aspects of their roles.

More workshops with targeted academics across an academic year as part of the project plan would have elicited more content.

Overall the work on this project has informed other work on eAssessment and eFeedback at Henley Business School, and will be revisited as part of the Henley review of assessment and feedback.

Follow up

In time more contributions will be sought so that the guide can better illustrate the differences between undergraduate and postgraduate marking requirements. It can then be made more interactive for web self-access or use in workshops.

Links

 Henley Marking Guide to Academic Misconduct and using Turnitin Originality Reports

Closing the ‘feedback loop’ using Unitu: Student uptake, usage and impact of a new online student feedback platform

Dr Emma Mayhew, Politics, Economics and International Relations
e.a.mayhew@reading.ac.uk
Year of activity: 2015/16

Overview

PLanT funding was used to research the impact of a new student feedback platform in Politics. Unitu creates an online student forum from which representatives pull issues onto a separate departmental board. Academics can then add responses and show if an issue has been actioned or closed by dragging between columns.

Objectives

  • To monitor student uptake and usage figures across the year.
  • To increase our understanding of the impact of the platform on the student experience.
  • To look more deeply at the effect of continuous feedback, via Unitu, on teaching and learning provision within the Department.

Context

Increasingly, providers are looking for alternative ways to encourage more continuous student engagement by opening channels of communication between staff and students to target further improvements to the student experience. This is particularly timely given the Teaching Excellence Framework and changes to National Student Survey questions which stress the importance of the student voice.

Implementation

In order to investigate student uptake, usage and impact of Unitu, the project team adopted a three stage approach:

  1. To survey students to assess their knowledge and experiences of Unitu. 120 questionnaires were received from students across all parts of the Department.
  2. To conduct a focus group of between 8 and 10 students to help draw out themes surrounding the student experience of using Unitu and impact of the platform on provision and satisfaction.
  3. To research the experiences of the University of Roehampton to look at how this provider raised awareness of the platform, how they encouraged student engagement and what kind of impact they had seen.

Impact

We now have easily accessible sign up and usage figures across the year. We can see how sign up figures respond to our promotional activity. We have a much better understanding of why some students were not aware of the platform, how some students encountered initial technical difficulties with sign up, why some purposefully prefer not to engage with Unitu and, for those that have, which features are of particular use and which are not. This data has led to changes in our approach to student communications and liaison with the software provider to amend some of the features offered.

Reflections

Although some features were problematic, such as numerous ‘new post’ email notifications, the overall response was positive. 58% of students enrolled onto the platform. 55 issues, questions or items of praise were posted, prompting 5,500 student views of follow-on discussion. 52% found Unitu raised the profile of student representatives. 61% felt it improved the student voice. 75% felt it showed exactly how the Department responded to student feedback. Some changes were made to teaching provision in response to student feedback, including addressing deadline clusters and balancing assessed and non-assessed presentations. Notably, the platform offers academic colleagues the opportunity to explore the pedagogical rationale behind curriculum design and assessment decisions. But we remain mindful of the way in which Unitu might lead to difficulties in managing student expectations in terms of the timing and nature of responses, as well as the impact of adopting a very open discussion forum, which does require clear rules of engagement.

Follow up

We have started work on broader dissemination of our experiences. In September 2016 a Part Two student, Jack Gillum, presented as part of a broader University of Reading symposium at the Researching, Advancing & Inspiring Student Engagement (RAISE) conference in Loughborough. Unitu is now being considered by Computer Science and the School of Construction Management and Engineering. We would like to continue to share our experiences with new adopters.

Game-based learning using social media

Dr Stanimira Milcheva, Henley Business School
stani.milcheva@henley.reading.ac.uk
Year of activity: 2015/16

Overview

We designed a simple game (called the REFinGame) which was aligned with the course material and launched it on Facebook. This approach, which could easily be applied to other discipline areas, was successfully used to enhance student learning and engagement with modules related to real estate finance.

Objectives

  • Allow students to develop transferable skills.
  • Allow students to apply course material in a real-world scenario.
  • Provide immediate and personalised feedback.
  • Improve interactions among students and between students and the lecturer.
  • Help make the module more interactive and enjoyable for students.

Context

Real Estate Finance and Debt Markets (REMF41) is a master’s module within Henley Business School. During the module students gain an awareness of the financing process for real estate from both a borrower’s and a lender’s point of view. The game was designed so that students could apply course material and learn to assess the risks associated with financing decisions.

Implementation

First, the REFinGame was designed together with Professor Charles Ward before the beginning of the module. The design had to take into account the course material and make simplifying assumptions so that the game could be modelled to best represent reality. The idea was that students would play the game over the course of the module, outside the classroom.

The game is about making financing decisions. Students are split into property developers (investors) and lenders (banks). The developers make decisions on how many properties to develop depending on how much money they have and how much funding they need from the bank. They also decide on the type of the properties, the location and other characteristics. The banks decide how much funding to provide to each developer. The game is played on Facebook on a weekly basis, with new information introduced on the Facebook Wall each week. Students advertise properties on the Wall, and the game coordinator decides the transaction price of the buildings based on the total supply by developers and the macroeconomic situation in that period. The main idea is that students learn to assess the risks associated with financing decisions, as they can lose the virtual money they have available by making the wrong decisions. The game is won by the student who accumulates the greatest amount of money.
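The pricing step described above can be sketched as a simple rule. The function below is purely illustrative (the actual REFinGame pricing rule is not published): it assumes prices fall as total supply rises and rise with a positive macroeconomic index, and all names and coefficients are hypothetical.

```python
def transaction_price(base_price, total_supply, macro_index,
                      supply_sensitivity=0.05, macro_sensitivity=0.10):
    """Hypothetical pricing rule for a game coordinator: price falls as
    developers bring more properties to market and rises in a strong
    economy. Parameter names and values are illustrative only, not the
    actual REFinGame rule."""
    supply_discount = 1 - supply_sensitivity * total_supply
    macro_premium = 1 + macro_sensitivity * macro_index
    # Prices cannot go negative, however oversupplied the market is
    return max(0.0, base_price * supply_discount * macro_premium)

# Example: 10 properties on the market in a mildly positive economy
price = transaction_price(base_price=100_000, total_supply=10, macro_index=0.5)
print(price)  # 52500.0
```

A rule of this shape lets students experience the core risk mechanic: overbuilding depresses everyone's sale prices, so borrowing heavily to develop many properties can backfire.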

A closed Facebook group was created for the module, a logo was created for the game, and students were briefed on how to play. The developers and lenders had to negotiate loan conditions using Facebook messages. They then advertised the properties they developed by putting pictures and information on the Wall. The purchase prices were then communicated to the developers by private message. Information about the economy and the markets was distributed as a post on the Wall. Students had to fill in a spreadsheet each week and send this to the game instructor, who then provided feedback to each student. At the end of the game, students shared their experience by giving an assessed presentation in which they described their strategy and performance throughout the game and compared it to their peers’.

Impact

A significant relationship was found between the students who performed well in the game and their overall module mark. Less tangible outcomes are that the game can help students develop skills such as problem solving, creativity, and strategic behaviour, and that it increases interaction among students and between the students and the lecturer. In particular we found that playing a game on Facebook helped to better integrate students who might be more reticent in class discussions. The lecturer can develop a better idea of each student’s performance, leading to students receiving tailored and regular feedback and being able to improve throughout the game. This is one of the main advantages that students identified, along with the playfulness of the game and the ease with which it is played on Facebook. The major issue students faced was the perception that the course material is not directly applied in the game. This demonstrates that it is important to manage student expectations as well as to have a structured approach to game design. Ultimately, our goal is to create guidelines for using self-designed simple games incorporating Facebook, and to improve student learning.

Reflection

The novelty of our approach is that we did not design a video game or a digital game using special software, but instead designed a simple game to be played online using Facebook as a platform. We wanted to show how with limited resources and time an instructor can construct a game and engage students with it, as Facebook is free and widely used by students. We have observed that the main challenge in the design of the game is to ensure that it aligns with the course material and to manage student expectations. For this purpose the instructor should very clearly explain how the game can benefit the students and how they will be assessed. Also, it is crucial to communicate how the course material can be used within the game to make decisions. For this purpose, the game designer needs to make sure that the students see the direct link between the course material and the learning outcomes of the game.

Blackboard Collaborate cross-campus tutorials as a useful tool to enhance the Part One Pharmacy student experience at the University of Reading Malaysia

Dr Darius Widera, School of Chemistry, Food and Pharmacy
d.widera@reading.ac.uk

Overview

After a successful application to act as one of the early adopters of Blackboard Collaborate at the University of Reading, this technology platform was used for a series of cross-campus tutorials within the Fundamentals of Physiology (PM1AM) module between the University of Reading’s Whiteknights and Malaysia campuses. The format was well-received, and contributed to an enhanced student experience.

Context

The official inauguration of the University of Reading Malaysia (UoRM) campus in EduCity, Johor Bahru, in early 2017 and the start of the MPharm (Malaysia) programme in the academic year 2016/17 offer excellent opportunities for further internationalisation of the University of Reading and specifically within Pharmacy education.

The University of Reading Malaysia offers a double accredited (UK and Malaysia) 2+2 MPharm (Hons) degree where the students study for two years at the Malaysia campus followed by two consecutive years in Reading.

The PM1A module and its UoRM counterpart, PM1AM, cover the basics of biology and human physiology including genetics, biochemistry and cell biology. According to student feedback, these topics tend to be challenging for the students, especially in light of the fact that significant numbers of Pharmacy students do not have A-level biology to provide background knowledge.

In response to this feedback, several tutorials have been introduced to provide students with interactive opportunities to revise the content of lectures and practical sessions and to close any potential knowledge gaps.

Thus, there was a need for the development of a cross-campus solution to ensure that both MPharm cohorts (UoR and UoRM) are provided with a similar form of tutorials.

Objectives

  • To explore if Blackboard Collaborate can be used for cross-campus delivery of tutorials covering the content of the genetics lecture series within the PM1A/PM1AM module.
  • To investigate if a cross-campus virtual classroom/teleconference represents an appropriate pedagogical tool for the delivery of tutorials in Pharmacy, and how this delivery method affects student engagement and interactivity.
  • To assess if these sessions could help 2+2 MPharm students to prepare for their two years of study in Reading.

Implementation

The Blackboard Collaborate platform was used to develop a series of tutorials in genetics. The online sessions were led by Dr Widera (live video capture via a webcam) at the University of Reading’s Whiteknights campus and streamed to students at the UoRM. The student group was composed of 11 Malaysian Part One MPharm students. The content of the tutorials was covered in the respective lectures. It was expected that students would have factual knowledge of the topic, although at heterogeneous levels.

All students were equipped with PCs with headsets and webcams. Blackboard Collaborate functions including ‘raise hand’, virtual whiteboard, chat, and direct interaction with all or individual students (either via audio or video) were used. In addition, external tools (e.g. Microsoft PowerPoint presentations and the Poll Everywhere app) were used via the ‘share screen’ function of Blackboard Collaborate. For the tutorial, an introductory PowerPoint presentation was designed and a screencast deposited on YouTube as a contingency plan. Multiple choice questionnaires (MCQs) were set up on the Poll Everywhere platform, and short answer questions (SAQs) were included in an additional PowerPoint presentation. After each MCQ/SAQ, students were given time to decide on an answer (individually via Poll Everywhere), followed by an interactive discussion.

The overall length of each tutorial session was 50 minutes. Individual anonymous post-hoc feedback was collected to evaluate student opinions on the usefulness, overall style, and delivery. In addition, a technical report and an experience log were collated and submitted to the Technology Enhanced Learning team. Finally, the content, delivery and potential changes were discussed with students and peers during a visit to the UoRM.

Impact

During the tutorials no serious technical issues were encountered, although students at UoRM did experience slight lagging in their connections (with video and audio becoming slightly out of sync). Students showed high levels of interaction and successfully used most of the Blackboard Collaborate features. Importantly, unlike in UoR in-class tutorials, students engaged and interacted early on. This is reflected in the feedback collected after the first session (“I like how it is interactive and fun”). The tutorial format also seemed to help students to revise the content of the lectures (“Useful to enhance my biology knowledge”, “It helps me to revise”, “It helps me to find out my difficulties with previous lectures”). Moreover, students appreciated that the session was different from conventional lectures (“It was different from just sitting in the classroom and listening to lecturers”, “it was another way of learning outside the classroom”). Last but not least, students appreciated that the tutorials were run by Reading-based staff whom the 2+2 students would meet during their two years in Reading (“can meet Dr Widera and learn from him”). No negative feedback was received.

Follow up

Following the feedback received, further tutorials involving other lecturers teaching on the PM1A module will be developed and implemented.

E-submission, marking and feedback – Pilar Gray-Carlos

OBJECTIVES

  • To facilitate the administrative process in submission of summative assessment
  • To inform module convenors and language teaching fellows of the tools supported by the University LMS Blackboard Learn
  • To provide the opportunity to apply the above tools, gather experience and inform decision on best approaches and best practice
  • To explore usability and applicability of existing marking criteria in the form of Tii (Turnitin) rubrics
  • To explore and facilitate a transition to use a basic set of QuickMarks across the Department whilst enabling room to create language specific amendments
  • To facilitate timely and transparent accessibility of results for students via the Grade Centre

CONTEXT

As part of summative assessment, students of intermediate to advanced language courses in IWLP Chinese, French, Italian, Japanese, German and Spanish submit a project (between 600 and 1000 words or characters according to language and stage) researched and written in the Target Language.

IWLP deals with a large volume of students each year, so it was important to explore a single point of submission that would enable staff to easily follow up submission deadlines and late submissions, eliminating paper-based trails and the involvement of multiple parties, and making it quick and easy for staff to keep track of submissions.

As the majority of language teaching staff work on a part-time basis, it was felt that it would be an advantage to have a point of access to students’ work from different locations. This also meant adopting electronic marking and feedback as a way to facilitate marking and moderation remotely.

Three years ago it was unclear whether Tii would support the modern languages provided by the IWLP programme. Once it was established that it did support modern languages, it was felt that the use of similarity reports would both assist teachers in detecting plagiarism and be good for student learning as it would force students to revise not just content but language as well and re-write when necessary.

One of the advantages of using electronic submission, marking and feedback is that both the marking criteria and the feedback can be provided in the same space, therefore avoiding reprinting and waiting for students to collect feedback. Language projects are assessed on the following areas: content, structure, vocabulary, grammatical accuracy, range of expression, syntax and variety of grammatical structure. The aim was to upload the project marking criteria in the form of rubrics hence facilitating all the tools for marking and feeding back in one place for tutors, providing an area readily available for moderation, and granting ease of access to results and feedback for students.

There were two e-submission options to be explored: e-submission with inline grading or e-submission via Tii assignment submission, the latter supplying the facility to use rubrics and quick marks via the Turnitin Suite.

IMPLEMENTATION

Initial meetings took place three years ago with members of the TEL team which highlighted the advantages of using the electronic submission of written work. The meetings involved coordinators and module convenors of the languages that initially provided intermediate to advanced stages: English for Erasmus, French, German and Spanish. It was then agreed to pilot the use of electronic submission and to initially explore the use of “inline marking” tools for marking and providing feedback.

Further training was arranged, delivered by both the TEL team and Pilar Gray Carlos and on-going support was provided on an ad-hoc basis.

The first round of assessments took place and the feedback collected from tutors was varied. Some colleagues developed feedback systems utilising tools such as colour underlining and text boxes. Because not only the content but also the language is assessed, and identifying, correcting and explaining language mistakes can be a detailed process, it was felt not only that it took time to get familiar with the new system, but also that the resulting corrections and feedback were not easily accessible to students, making it necessary to print students’ work and go over corrections and feedback again in the classroom.

A period of required e-submission, but voluntary use of electronic marking and feedback, followed until there was confirmation that Tii supported other modern languages. By this time modern languages such as Mandarin Chinese and Japanese had added intermediate courses to their provision. It was then decided to take the opportunity to start using Tii as a formative tool as well, familiarising students with its use and enabling them to self-evaluate and revise their own work. Similarity reports were enabled for formative submissions during the course, in view of the final submission of summative coursework.

Opportunities for training by the TEL team and in-house training were organised and provided by Pilar Gray Carlos and more experienced colleagues within ISLI. In this way module convenors and tutors were shown how rubrics and QuickMarks are used for marking and feedback in language teaching (see Rob Playfair’s case study and Jonathan Smith’s interview).

At the same time, and in parallel with work on e-submission, marking and feedback, a Grade Centre was created for the 31 modules provided in the 10 different languages. Weighted columns were created per assessment per module, teachers could directly input results, and students had direct access to marks as they were released.

Having all that data available also meant that, although limited, some reports could be printed on module performance per assessment; for languages where classes are taught in parallel groups, group performance data reports could also be produced.

Since then the EMA Core Systems Team has delivered a more streamlined process which produces similar data sets on RISIS (for more detailed information see the EMA Programme short videos link below).

IMPACT

The use of e-submission has enabled a variety of approaches to formative assessment to flourish; some languages have made the most of e-submission to collect students’ work and to feed back online.

As for summative assessment, the adoption of QuickMarks is facilitating marking; once teachers get accustomed to using them, it becomes an efficient way to point out generic language errors.

The use of the Grade Centre was a success: it cut down on administration, freeing time for administrators and teachers, and it provides helpful information on the performance of certain cohorts and groups. The only drawback was the missing step between Grade Centre and RISIS. At that point in time the only way to update records in RISIS was to download all marks as a spreadsheet and manually input them into RISIS. The EMA Programme Core Systems Workstream are working to improve the integration between Blackboard and RISIS.
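The manual Grade Centre-to-RISIS step described above amounts to reshaping an exported marks spreadsheet. A minimal sketch of that reshaping is below; the column names (`Student ID`, `Total`) and the module code are assumptions for illustration, not the real Blackboard export format or RISIS input format:

```python
import csv
import io

def grade_centre_to_risis(export_csv, module_code):
    """Reshape a Grade Centre CSV export into rows ready for entry into a
    student records system. The column names 'Student ID' and 'Total' are
    hypothetical stand-ins for the real Blackboard export headers."""
    rows = []
    for record in csv.DictReader(io.StringIO(export_csv)):
        rows.append({
            "module": module_code,
            "student": record["Student ID"],
            # Marks are commonly recorded as integers; round the raw total
            "mark": round(float(record["Total"])),
        })
    return rows

sample = "Student ID,Total\n12345,67.4\n67890,58.6\n"
print(grade_centre_to_risis(sample, "LA1XX"))  # "LA1XX" is a made-up code
```

Even a small script like this removes the retyping step that made the manual process error-prone, which is the gap the EMA integration work aims to close.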

REFLECTIONS

The feedback obtained from the teachers indicates healthy satisfaction with e-submission. It is also positive with regard to marking content, but divided on how to approach correction and feedback on language items, which can be as particular as the individual learner and as the language itself. Written adjustments and examples need to be inserted into the text, which seems faster on paper than electronically; the very specific task of inserting grammatical symbols into a text in language teaching may need some additional thinking. The EMA Team are looking at requirements surrounding scientific, mathematical and grammatical notations within the University and possible ways forward.

The general consensus is that, of the two options, Tii is at present better suited to language projects than inline marking. In order to enable that transition we need to review the set of rubrics we are using and adopt sets of QuickMarks applicable to all languages, with perhaps the addition of specific sets for non-Latin scripts.

FOLLOW UP

There will be a small working group set up to revise QuickMarks across all languages. This working group will also look into the rubrics and how we can best customise them for our assessment purposes, in line with the CEFR (Common European Framework of Reference for Languages).

LINKS

EMA Project Reading Resources

http://www.reading.ac.uk/internal/ema/ema-resources.aspx

Common European Framework of Reference for Languages (CEFR)

https://www.coe.int/en/web/common-european-framework-reference-languages/

Reading University Observatory: A web-based resource for 21st century teaching and learning

Dr Andrew Gabey,

School of Mathematical, Physical and Computation Sciences

a.m.gabey@reading.ac.uk

Year of activity: 2016/17

Overview

The University’s Atmospheric Observatory continuously collects high-quality environmental data, which is used heavily in teaching courses – particularly in Meteorology.  A new web-based system, due to go into service for the Autumn semester, has been developed under this project so that the data is (i) more easily accessed by students, and (ii) pulled automatically into other software applications, such as interactive websites, for either teaching or outreach. Alongside these impacts, the system represents a more manageable way to disseminate data, and is a helpful case study for developing digital offerings using Cloud technologies supported by University IT Services.

Objectives

We aimed to build a modern environmental data service based on data from the University Atmospheric Observatory that:

  • Provides an improved user experience for students in the various classes using this data.
  • Can be accessed by anybody with permission, on or off-campus.
  • Supports development of data-driven applications, including interactive websites, that help explain the environment and climate.

Context

Meteorology departments generally teach with data from their own atmospheric observatories, often using clunky methods. Our school website provides an on-campus-only service for students to access data needed for Meteorology, Sustainability, Biology and Geography classes, but the software for this has grown organically and has reached a point where the user experience is somewhat overwhelming. The technologies used are also unsuitable for modern applications such as interactive data-driven websites that could showcase the University’s facilities.

Implementation

Stakeholder input and co-ordination: Meetings were held with the departmental data manager, laboratory technicians, other research staff interested in sharing data efficiently, and the HoD responsible for funding the on-going computing cost of operating the service. As these stakeholders had been engaged during proposal writing, the discussions were broadly positive and yielded useful considerations such as the need for legal wording on the website.

Design and implementation of software: The proposal document was used to inform technical requirements passed to the programmers. These focussed on the different journeys taken by service users and administrators, and feedback between the programmers and me helped to smooth interpretation.

Standing up the service: University IT Services were happy to explore ways of helping people to deliver services using cloud-based approaches, and even covered the first few months of running costs while we determined how things should work in terms of finance and support.

Documentation and support: The completed code is stored on the GitHub website, along with installation and administration instructions for system maintenance and, hopefully, the addition of more data holdings and users as time goes on.

Impact

Expected outcomes

To ensure successful outcomes, we established technical requirements based on the planned benefits to teaching and learning: improved user experience through a better user interface; accessibility from anywhere; allowing the data manager to tailor data for classes/individuals; and employing more modern web technologies.

Measured against these technical criteria, the planned benefits have all been solidly realised, and the system is being stood up for use by Meteorology students as the new academic year begins (subject to ITS support). Initial user feedback has been positive, with test users able to extract data without needing much help. Where help was required it was mostly due to bugs, which have been resolved (see Follow up for more feedback).

Unexpected outcomes

IT Services: We employed Microsoft Cloud technologies to power the service, and this in turn has allowed IT Services to determine how they can support groups within the university keen to innovate in this way.

Technological development within the department: This software has formed the basis of a similar tool to share research data elsewhere in the department, and can in theory be applied to many such datasets.

Reflections

As this was fundamentally a software project, it was essential to have well-developed requirements and criteria for success. These were worked through in detail at the start, and left enough room at the end that small extra tasks could be completed to refine the finished product.

The hardest part was spending the money: although the University Careers centre were very helpful, we were unable to secure any suitable interns, having advertised the role as a summer project. An email to the departmental PhD students yielded a pair with the perfect background, and the work was completed within the increasingly tight deadline, and to budget, paid via ad-hoc work forms. Appealing to PhD students first, rather than holding out for a summer intern, would have been the wiser course.

A more impactful result would have been achieved if we had built some demonstrations of how the new system could be applied. For example, web-based data visualisation would show how accessible the data is; and negotiating with the University to make some of the archive available to the public would have been helpful for outreach. Public datasets are supported in the software, so a decision to make data available is easy to implement.

Follow up

Initial feedback was positive from teaching support staff, and constructive criticism was taken on board. For example, a test user was able to choose invalid dates like 31 April, which resulted in errors. Concerns were also raised about it being hard to go back and change options when a mistake was made. Refinements were made (reflected in images below) to address these.
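The "31 April" problem described above is a classic input-validation issue. One common way to catch it server-side (a sketch under assumed function names, not the project's actual code) is to let Python's `datetime` reject impossible dates rather than passing them through to the data query:

```python
from datetime import date

def parse_request_date(year, month, day):
    """Return a valid date, or None for impossible combinations.
    datetime.date raises ValueError for dates such as 31 April, so an
    invalid request can be answered with a friendly message instead of
    producing a server error downstream."""
    try:
        return date(year, month, day)
    except ValueError:
        return None

assert parse_request_date(2017, 4, 31) is None            # 31 April rejected
assert parse_request_date(2017, 4, 30) == date(2017, 4, 30)
```

Pairing a check like this with the interactive date pickers mentioned in the figure captions prevents invalid dates both at the interface and at the service boundary.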

Links

Atmospheric Observatory

sample webpage from the online database

Service homepage showing information on its use and how to get access to data.

sample webpage from the online database

Some of the options presented to the student user for data download. They are only presented with relevant information, and interface elements such as interactive date pickers are employed to make the experience more intuitive.

sample webpage from the online database

One of the administration screens allowing specific parts of the University’s large data archive to be assigned to a student, rather than presenting all possible options to them.

Virtual teaching collections in Archaeology and Classics: turning artefacts into 3D models

Dr Robert Hosfield, School of Archaeology, Geography and Environmental Science

r.hosfield@reading.ac.uk

Year of activity: 2015/16

Sample image

Lekythos

Overview

The project tested different methods for producing and disseminating 3D models of existing artefacts in the teaching collections of Classics and Archaeology. 3D scanning was labour-intensive and struggled to accurately represent some of the raw materials. By contrast, photogrammetry was more cost- and time-effective and produced better-quality results (see attached figure). Sketchfab was an effective, user-friendly platform for disseminating the models (https://sketchfab.com/uremuseum), and student feedback was positive.

Objectives

  1. Produce and evaluate 3D laser scans of 10 lithic artefacts and 5 ceramic artefacts from the teaching collections of Classics and Archaeology, with analysis of 3D model resolution, cost, and time requirements, and dissemination options;
  2. Document student evaluations of the new resources.

Context

Archaeology and Classics have wide ranging teaching collections of objects, both genuine and replica, from the human past (e.g. Greek and Roman ceramics). While students have access to this material in practical classes and seminars, out-of-class access is more difficult, due to (i) the intensive use of the teaching spaces holding the collections, and (ii) the fragility of selected specimens. The project explored methods that could enable students to engage with this material evidence through digital models.

Implementation

The project was primarily undertaken by four Reading students, both postgraduate and undergraduate: Rosie-May Howard (BSc Archaeology, Part 2), Matthew Abel (BA Museum Studies & Archaeology, Part 1), Daniel O’Brien (BA Ancient History & Archaeology, Part 3), and James Lloyd (Classics, PGR). Supervision and support was provided by Prof. Amy Smith (Classics), Dr Rob Hosfield (Archaeology) and Dr Stuart Black (Archaeology). The four students undertook the following tasks:

(i) Testing the Ure Museum’s NextEngine™ HD 3D scanner and associated processing software ScanStudio™ to produce 3D laser scan models of selected artefacts (ceramics from the Ure Museum and stone tools from the Archaeology teaching collections).

(ii) Testing 3D printing of the laser scan models using the Ure Museum’s CubePro™ 3D printer.

(iii) Testing the digital representation of the same range of artefacts through photogrammetry, using Memento by Autodesk.

(iv) Trialling the use of Sketchfab as a remote site for posting, storing and accessing the 3D models.

(v) Assessing student responses to the models through a SurveyMonkey questionnaire.

Impact

(i) The 3D laser scan models provided volumetric data (unlike the photogrammetry models), but struggled with the regular shapes and repeating patterns which were characteristic of many of the ceramics. The laser scanning process was also time-intensive.

(ii) The laser scanner struggled to represent some of the stone artefacts, with the resulting models characterised by poorly defined edges and ‘holes’, due to the material properties of the flint raw material.

(iii) Photogrammetry was used successfully to create 3D models of ceramics from the Ure Museum collection.

(iv) Sketchfab was a flexible interface for ‘touching up’ and annotating the models, and was more user-friendly than other options (e.g. ScanStudio).

The quality of the 3D printing was mixed, leading to a decision during the project to focus on digital models that could be accessed on-line.

(v) Students responded positively to the virtual models, and would like to see more in future!

Sample survey questions and responses:

Q: What (if any) other objects/material types would you like to see as 3D models?

A: It would be interesting to see 3D models of smaller, more dainty objects as these can often be difficult to look at on such a small scale.

Q: Do you have any other comments?

A: This is a great project that should keep going! P.S. A scale will be helpful for accurately describing the objects. There’s a Part 2 Archaeology module called Artefacts in Archaeology and the scans could be used as an at-home resource by students.

Reflections

The project was successful in clearly highlighting the relative strengths and weaknesses of the 3D laser scan and photogrammetry methods for creating digital models of artefacts. In terms of cost and time, photogrammetry was clearly the more effective method, while the mixed results of the 3D printing experiments pointed to online hosts such as Sketchfab as the most effective way of disseminating the models.

More specifically, exploring the photogrammetry option highlighted the potential of the Agisoft PhotoScan software as an effective method for museums or HEIs wishing to capture large collections for teaching and/or archiving purposes.

Student responses emphasised the importance of providing a wide range of models if these sorts of teaching resources are to be further developed.

Follow up

Archaeology has purchased copies of the Agisoft PhotoScan software and is currently looking to develop a photogrammetry-based digital database of its teaching collections.

At the Ure Museum 3D scans are being made available via Sketchfab and more thorough use of photogrammetry is being considered; virtual models of the vases scanned for CL1GH are being used in seminars this term.

Links

https://sketchfab.com/uremuseum