Engaging students in the design of assessment criteria

Dr Maria Kambouri-Danos, Institute of Education
m.kambouridanos@reading.ac.uk
Year of activity: 2017/18

Overview

I recently led a group of colleagues working in partnership with students to develop a new module for the BA in Children’s Development and Learning (BACDL), delivered at the Institute of Education (IoE). Working in partnership with students is a core part of the project’s aim, and the work described here formed part of a Partnerships in Learning & Teaching (PLanT) project.

Objective

The team’s aim was to develop and finalise a new module for the BACDL in close partnership with students. The new module will replace two existing modules from 2018-19, with the aim of reducing overall assessment at programme level, a need identified during a Curriculum Review Exercise. The objective was to adopt an inclusive approach to student engagement when finalising the new module, aiming to:

  • Go beyond feedback and engage students by listening to the ‘student voice’
  • Co-develop effective and student-friendly assessable outcomes
  • Identify opportunities for ‘assessment for learning’
  • Think about constructive alignment within the module
  • Encourage the development of student-staff partnerships

Context

To accomplish the above, I brought together five academics and six students (both BACDL and Foundation Degree (FDCDL) students). Most of the students on this programme are mature students, many with dependants, who work full time while attending university one day a week. To encourage students from this ‘hard to reach’ group to engage with the activity, we secured funding through the Partnerships in Learning & Teaching scheme, which enabled the engagement of a more diverse group (Trowler, 2010).

Implementation

The team participated in four partnership workshops, during which staff and students engaged in activities and discussions that helped to develop and finalise the new module. During the first workshop, we discussed the aims of the collaborative work and went through the module’s summary, aims and assessable outcomes. We looked at the two pre-existing modules and explored merging them into a new module, maintaining key content and elements of quality. During the second workshop, we explored chapter two from the book ‘Developing Effective Assessment in Higher Education: a practical guide’ (Bloxham & Boyd, 2007), which guided the discussions around developing the assessment design for the new module.

During the third workshop, we discussed aspects of summative and formative tasks and finalised the assessment design (Knight, 2012). We then shared the new module description with the whole BACDL cohort and requested feedback, which enabled us to gather other students’ views, ensuring a diverse contribution of views and ideas (Kuh, 2007). During the last workshop, with support from the Centre for Quality Support and Development (CQSD) team, we ran a game-format workshop and created a visual ‘storyboard’ outlining the type and sequence of learning activities required to meet the module’s learning outcomes (ABC workshop: http://blogs.ucl.ac.uk/abc-ld/home/abc-workshop-resources/). This helped us to evaluate how the new module links with the rest of the programme’s modules, and to think about practical aspects of delivering the module and ways to better support students (e.g. through the virtual learning environment).

Photos from staff-student partnership workshops

Impact

The close collaboration within the team ensured that the student voice was heard and taken into account while developing the new module. The partnership workshops provided the time to think collaboratively about constructive alignment and to ensure that the new module’s assessment supports student learning. They also ensured that the module’s assessable outcomes were clearly defined in student-friendly language.

A pre- and post-workshop survey was used to evaluate the impact of this work. The survey measured the degree to which students appreciate the importance of providing feedback, participate in activities related to curriculum review and design, feel part of a staff-student community, and feel included in developing their programme. The results indicate an increase on all of these measures, demonstrating the positive impact that activities like this can have on the student experience. All students agreed that taking part in this collaborative work had been beneficial, mentioning that being engaged in the process, either directly (attending the workshops) or indirectly (providing feedback), helped them to develop a sense of belonging and feel part of a community of staff and students working together (Trowler, 2010; Kuh, 2005, 2007).

Reflections

This project supported the successful development of the new module, from which future students will benefit (Kuh, 2005). The work that the team produced has also informed the work of other groups within the IoE. At the institutional level, it has supported the development of the CQSD ‘Student Engagement’ projects. All of the above was achieved through close collaboration, and could not have been done by a group of individuals working on their own (Wheatley, 2010). In recognition of this, our team was awarded a University Collaborative Award for Outstanding Contributions to Teaching and Learning.

References

Bloxham, S. & Boyd, P. (2007). Developing effective assessment in higher education: a practical guide. Maidenhead: Open University Press.

Knight, P. (Ed.). (2012). Assessment for learning in higher education. Routledge.

Kuh, G.D. (2005). Putting Student Engagement Results to Use: Lessons from the Field. Assessment Update, 17(1), 12–13.

Kuh, G.D. (2007). How to Help Students Achieve. Chronicle of Higher Education, 53(41), 12–13.

Trowler, V. (2010). Student engagement literature review. The Higher Education Academy.

Wheatley, M. (2010). Finding our Way: Leadership for an Uncertain Time. San Francisco: Berrett-Koehler.

The benefits of new marks availability on RISIS for personal tutoring, by Dr Madeleine Davies (EMA Academic) and Kat Lee (External)

The EMA Programme has delivered a new function within RISIS that allows colleagues to see their students’ sub modular marks on the Tutor Card. We have all had access to the Tutor Card for some time and it has provided an invaluable snapshot of a student’s degree history, particularly useful for writing references and for monitoring attendance. However, in terms of sub modular marks, it has always functioned retrospectively: prior to the start of the new academic year, our students’ updated assessment records from the previous session are available on the Card but they have never been available during the academic session.

The sub modular mark screens accessible via the Tutor Card mean that we will no longer have to wait until the end of the academic year to have access to our students’ assessment information and this creates a range of benefits for personal tutors in particular. Easy access to the sub modular marks will provide an early indication of any problems that our students may be having and this will allow us to address these issues in a timely manner.

The information becoming available is significantly more extensive than a list of marks alone: a series of codes is used to flag up, for example, a result involving academic misconduct or extenuating circumstances requests (scroll down the page to translate the codes via the key), and a hover function under ‘Notes’ provides submission details so that personal tutors can tell when a ‘late’ penalty has been applied or when there has been another change to a mark (see image). Any one of these situations would require personal tutor intervention but, until now, this information has not been available to us unless our tutees have chosen to disclose it in personal tutor meetings.

The new screens are, then, particularly significant for our work as personal tutors: the wealth of information made available gives tutors the means to identify and support students who are struggling before they find themselves in crisis. Proactive and early intervention is always more effective than reactive response, and the additional access to information during the year that has been made available by EMA allows us to ensure that no student falls behind without us realising it.

The new screens also connect with the University’s inclusivity agenda in that students coming to us from non-traditional educational backgrounds can need extra support in their first months with us. The screens will alert us to situations where Study Advice, or Counselling and Wellbeing, need to be consulted.

In addition, students who may be of concern in academic engagement and/or Fitness to Study processes can be checked at every assessment point, giving Senior Tutors and SDTLs the opportunity to assess a student’s ability to cope with the pressure of assessment deadlines. This in turn facilitates early intervention in problematic cases and provides an easily available record of performance in cases requiring escalation.

The role of the personal tutor primarily involves offering tutees academic advice in response to their marks, feedback and more general concerns. The addition to the Tutor Card of sub modular marks and notes during the course of the year underpins this work and creates the opportunity for meaningful discussions with our tutees. New access to this information allows us to respond to student issues ‘in real time’, thus allowing personal tutors to act as effective academic advisors, and to engage in crucial developmental dialogue with the students in our care.

To view a screencast that shows you how to navigate the sub modular mark screens on the tutor card, click https://www.screencast.com/t/sKCH4czjJ

To view a screencast that shows you how to navigate the Module Convenor Screens that are now also live, click http://www.screencast.com/t/MjCxE6UxfM

For further information on the EMA Programme, please click http://www.reading.ac.uk/ema/

Involving students in the appraisal of rubrics for performance-based assessment in Foreign Languages, by Dott. Rita Balestrini

Context

In 2016, in the Department of Modern Languages and European Studies (DMLES), it was decided that the marking schemes used to assess writing and speaking skills needed to be revised and standardised in order to ensure transparency and consistency of evaluation across different languages and levels. A number of colleagues teaching language modules had a preliminary meeting to discuss what changes had to be made, what criteria to include in the new rubrics and whether the new marking schemes would apply to all levels. While addressing these questions, I developed a project with the support of the Teaching and Learning Development Fund. The project, now in its final stage, aims to enhance the process of assessing writing and speaking skills across the languages taught in the department. It intends to make assessment more transparent, understandable and useful for students; foster their active participation in the process; and increase their uptake of feedback.

The first stage of the project involved:

  • a literature review on the use of standard-based assessment, assessment rubrics and exemplars in higher education;
  • the organization of three focus groups, one for each year of study;
  • the development of a questionnaire, in collaboration with three students, based on the initial findings from the focus groups;
  • the collection of exemplars of written and oral work to be piloted for one Beginners language module.

I have had a few opportunities to disseminate some key ideas that emerged from the literature review: at the School of Literature and Languages’ assessment and feedback away day, at the CQSD showcase, and at the autumn meeting of the Language Teaching Community of Practice. Having only touched upon the focus groups at the CQSD showcase, I will describe here how they were organised, run and analysed, and will summarise some of the insights gained.

Organising and running the focus groups

Focus groups are a method of qualitative research that has become increasingly popular and is often used to inform policies and improve the provision of services. However, the data generated by a focus group are not generalisable to a population group as a whole (Barbour, 2007; Howitt, 2016).

After attending the People Development session on ‘Conducting Focus Groups’, I realised that the logistics of their organisation, the transcription of the discussions and the analysis of the data they generate require a considerable amount of time and detailed planning. Nonetheless, I decided to use them to gain insights into students’ perspectives on the assessment process and into their understanding of marking criteria.

The recruitment of participants was not a quick task. It involved sending several emails to students studying at least one language in the department and visiting classrooms to advertise the project. In the end, I managed to recruit twenty-two volunteers: eight for Part I, six for Part II and eight for Part III. I obtained their consent to record the discussions and use the data generated by the analysis. As a ‘thank you’ for participating, students received a £10 Amazon voucher.

Each focus group lasted one hour; the discussions were recorded in full and were based on the same topic guide and stimulus material. To open the discussion, I used visual stimuli and asked the following question:

  • In your opinion, what is the aim of assessment?

In all three groups, this triggered some initial interaction directly with me. I then started picking up on differences between participants’ perspectives, asking for clarification and using their insights. Slowly, a relaxed and non-threatening atmosphere developed and led to more spontaneous and natural group conversation, which followed different dynamics in each group. I then began to draw on some core questions I had prepared to elicit students’ perspectives. During each session, I took notes on turn-taking and some relevant contextual clues.

I ended all three focus group sessions by asking participants to carry out a task in groups of three or four. I gave each group a copy of the marking criteria currently used in the department and an empty grid reproducing the structure of the marking schemes. I asked them the following question:

  • If you were given the chance to generate your own marking criteria, what aspects of writing/speaking/translating would you add or eliminate?

I then invited them to discuss their views and use the empty grid to write down the main ideas shared by the members of their group. The most desired criteria were effort, commitment, and participation.

Transcribing and analysing the focus groups’ discussions

Focus groups, as a qualitative method, are not tied to any specific analytical framework, but qualitative researchers warn us not to take the discourse data at face value (Barbour, 2007:21). Bearing this in mind, I transcribed the recorded discussions and chose discourse analysis as an analytical framework to identify the discursive patterns emerging from students’ spoken interactions.

The focus of the analysis was on ‘words’ and ‘ideas’ rather than on the process of interaction. I read and listened to the discussions many times and, as I identified recurrent themes, I started coding excerpts. I then moved back and forth between the coding frame and the transcripts, adding or removing themes, renaming them, and reallocating excerpts to different themes.
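For readers who find it helpful to picture the mechanics, this back-and-forth can be thought of as a handful of operations on a simple mapping from themes to excerpts. The sketch below is purely illustrative (the analysis described here was a manual, interpretative one), and the theme names and excerpts are invented placeholders:

```python
# Illustrative sketch of an iterative coding frame: a mapping from themes to
# transcript excerpts. Theme names and excerpts are hypothetical placeholders.
coding_frame: dict[str, list[str]] = {}

def code(theme: str, excerpt: str) -> None:
    """Allocate a transcript excerpt to a theme, creating the theme if new."""
    coding_frame.setdefault(theme, []).append(excerpt)

def rename(old: str, new: str) -> None:
    """Rename a theme, carrying its excerpts across."""
    coding_frame[new] = coding_frame.pop(old)

def reallocate(excerpt: str, src: str, dst: str) -> None:
    """Move an excerpt between themes after re-reading the transcripts."""
    coding_frame[src].remove(excerpt)
    code(dst, excerpt)

code("complexity", "you cannot reduce a language to a list of boxes")
code("vagueness", "what does 'generally accurate' actually mean?")
rename("vagueness", "ambiguity of standards")
print(coding_frame)
```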

Spoken discourse lends itself to multiple levels of analysis, but since my focus was on students’ perspectives on the assessment process and their understanding of marking criteria, I concentrated on those themes that seemed to offer most insight into these specific aspects. Relating the themes to one another helped me to shed new light on some familiar issues and to reflect on them in a new way.

Some insights into students’ perspectives

As language learners, students gain personal experience of the complexity of language and language learning, and the analysis suggests that they draw on this theme of complexity to articulate their unease with the atomistic approach to evaluation embodied in rubrics and, at times, to contest the descriptors of the standard for a first-class mark. This made me reflect on whether the achievement of near native-like ability is really the standard on which we want to base our evaluation. Larsen-Freeman’s (2015) and Kramsch’s (2008) treatment of language development as a ‘complex system’ helped me to shed light on the ideas of ‘complexity’ and ‘non-linear relations’ in the context of language learning which emerged from the analysis.

The second theme I identified is the ambiguity and vagueness of the standards for each criterion. Students draw on this theme not so much to communicate a lack of understanding of the marking scheme as to question the reliability of a process of evaluation that matches performances to numerical values by means of opaque descriptors.

The third theme that runs through the discussions is the tension between the promise of objectivity held out by marking schemes and the fact that their use inevitably involves an element of subjectivity. There is also a tension between the desire for an objective counting of errors and the feeling that ‘errors’ need to be ‘weighted’ in relation to a specific learning context and an individual learning path. On the one hand, there is the unpredictable and infinite variety of complex performances, which cannot easily be broken down into parts to be evaluated objectively; on the other, there is the expectation that the sum of the parts, when adequately mapped to clear marking schemes, results in an objective mark.

Rubrics in general seem to be part of a double discourse. As an instructional tool they are described as unreliable, discouraging and disheartening. The feedback they provide is not seen as supporting language development in the way that the complex, personalised feedback teachers provide does. Effective and engaging feedback is always associated with the expert knowledge of a teacher, never with rubrics. However, the need for rubrics as a tool of evaluation is not questioned in itself.

The idea of using exemplars to pin down standards and make the process of evaluation more objective emerged from the Part III focus group discussion. Students weighed the pros and cons of using exemplars, drawing on the same rationales that are debated in scholarly articles. Listening to, and reading systematically through, students’ discourse was quite revealing, and brought to light some questionable views on language and language assessment that most marking schemes measuring achievement in foreign languages help to promote.

Conclusion

The insights into students’ perspectives gained from the analysis of the focus groups suggest that rubrics can easily create false expectations in students and foster an assessment ‘culture’ based on an idea of learning as a steady increase in skills. We need to ask ourselves how we could design marking schemes that communicate a more realistic view of language development. Could we create marking schemes that students do not find disheartening, and that help them understand how to progress? Rather than just evaluation tools, rubrics should be learning tools that describe different levels of performance and avoid evaluative language.

However, the issues of ‘transparency’ and ‘reliability’ cannot be solved by designing clearer, more detailed or student-friendly rubrics. These issues can only be addressed by sharing our expert knowledge of ‘criteria’ and ‘standards’ with students, which can be achieved through dialogue, practice, observation and imitation. Engaging students in marking exercises and involving them in the construction of marking schemes – for example by asking them how they would measure commonly desired criteria like effort and commitment – offers us a way forward.

References

Barbour, R. 2007. Doing focus groups. London: Sage.

Howitt, D. 2016. Qualitative Research Methods in Psychology. Harlow: Pearson.

Kramsch, C. 2008. Ecological perspectives on foreign language education. Language Teaching 41 (3): 389-408.

Larsen-Freeman, D. 2015. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Potter, J. and Wetherell, M. 1987. Discourse and social psychology: Beyond attitudes and behaviour. London: Sage.

 

Links to related posts

‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Working in partnership with our lecturers to redesign language marking schemes 

Sharing the ‘secrets’: Involving students in the use (and design?) of marking schemes

Leaner, Cleaner, Greener: How Reading’s assessment data is changing for the better.

Dr Emma Mayhew (EMA Academic Director), Dr Madeleine Davies (EMA Academic Partner), Kat Lee (Project Manager, External)

The Electronic Management of Assessment (EMA) Programme has been created to deliver the University’s long-term vision for online assessment while improving the underlying processes and supporting systems. The reduction in manual assessment recording is at the heart of changes being delivered this autumn by one of the Programme’s workstreams, Core Systems, which is making headway towards the ultimate aim of being able to integrate Blackboard and RISIS assessment information and marks.

The challenge for Reading is that sub modular marks calculation in RISIS needs full information about all of the assessments contributing towards an overall mark, and this information is currently stored in Excel. LOTS of Excel. The biggest problem with spreadsheets is their isolation from the rest of an organisation, which makes collaboration tricky: data cannot be automatically or easily incorporated into other processes or systems. UoR is not exempt from this challenge, which causes multiple requests for the same or similar module and assessment information throughout the academic year. This gives rise to frustration for all colleagues involved in the process and makes information difficult to access quickly.
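To make the isolation problem concrete, here is a minimal sketch of what stitching scattered per-module spreadsheets back together might look like. This is not the EMA Programme’s actual tooling: the folder layout and the column names (Student, Assessment, Weight, Mark) are assumptions made for illustration.

```python
# Illustrative only: consolidating hypothetical per-module marks spreadsheets.
# Assumes one .xlsx file per module in marks/, each with the columns
# Student, Assessment, Weight, Mark.
from pathlib import Path

import pandas as pd

frames = []
for path in Path("marks").glob("*.xlsx"):
    df = pd.read_excel(path)   # one isolated spreadsheet per module
    df["Module"] = path.stem   # recover the module code from the file name
    frames.append(df)

all_marks = pd.concat(frames, ignore_index=True)

# An overall module mark can only be calculated once every contributing
# assessment is known, i.e. the component weights sum to 100%.
weights = (all_marks.drop_duplicates(["Module", "Assessment"])
                    .groupby("Module")["Weight"].sum())
print("Modules with incomplete assessment information:")
print(weights[weights != 100])

# Weighted overall mark per student per module.
all_marks["Weighted"] = all_marks["Mark"] * all_marks["Weight"] / 100
print(all_marks.groupby(["Module", "Student"])["Weighted"].sum())
```

Holding the same information once, in RISIS, removes the need for this kind of ad hoc stitching, and with it the repeated requests for the same information.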

Over the last three months, programme administration colleagues across the University have been supporting the transition to sub modular marks by creating the starting point for detailed assessment information for UG modules running in the 2017/18 academic year. It has been a significant task, focused on creating leaner, greener and more streamlined approaches to managing assessment and marks data.

We are now able to announce the following improvements that we are delivering for the Autumn Term:

1) Module Convenors

From the beginning of term, all module convenors for UG modules will be able to view the sub modular assessment information held in RISIS for their modules. This will allow them to track their modules and to identify any problems at an earlier stage of the academic year. It will also be a one-stop resource for all module information, so that queries can be answered quickly and easily by accessing this screen.

2) Mark Entry

Programme Administrators will be able to enter sub modular marks into RISIS for UG assessment from November onwards (where work has already been submitted and marked). Corresponding grades will show where penalties, such as late deductions, have been applied. This allows Programme Administrators, Exams Officers and Senior Tutors to drill down into the details of students’ grades, to check the history of marks more easily, and to diagnose problems quickly.

3) Personal Tutors

Building on the existing Tutor Card area of RISIS, additional information will be available to show the breakdown of individual sub modular assessment marks for tutees during the course of the academic year. Previously, many colleagues had to wait until the end of the academic session to access this information, and even then they might only have been able to access overall module marks. The new screen will provide current information and greatly enhanced detail (see image).


4) Reporting

As well as being able to download information where required, schools will have access to a number of pre-defined reports providing assessment information such as submission dates and assessment types. SDTLs will, for example, be able to identify where assessment bunching occurs.
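As a rough illustration of the kind of question such a report answers (the actual RISIS reports are pre-defined; the data and the threshold below are invented for the example), spotting bunching amounts to counting deadlines per programme per week:

```python
# Hypothetical sketch of spotting assessment bunching: count submission
# deadlines per programme per ISO week and flag crowded weeks.
import pandas as pd

deadlines = pd.DataFrame({
    "Programme": ["BA English"] * 5 + ["LLB Law"] * 3,
    "Deadline": pd.to_datetime([
        "2017-11-13", "2017-11-15", "2017-11-16", "2017-11-17", "2017-12-01",
        "2017-11-20", "2017-12-04", "2017-12-06",
    ]),
})

weekly = (deadlines
          .assign(Week=deadlines["Deadline"].dt.isocalendar().week)
          .groupby(["Programme", "Week"])
          .size())

# Flag any week in which a programme has three or more deadlines.
print(weekly[weekly >= 3])   # here: four BA English deadlines in ISO week 46
```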

The goal is to produce a ‘cleaner’ system that is intuitive and responsive to staff and student needs. The team is working with a graduate student representative and with RUSU to obtain student perspectives on the upcoming changes and to work towards enabling a consistently good student assessment experience.

To help you find out more about the immediate benefits going live this term, the EMA Programme is running a webinar on Monday 11th September to highlight some of the changes and new RISIS screens. If you would like to sign up for the webinar, please contact the EMA team at ema@reading.ac.uk.

More broadly, the team working on the EMA Programme would like all our colleagues to feel that they can share any good ideas with us and discuss any thoughts they have about the programme. If you would like to contact us, we would be delighted to hear from you. Please do e-mail EMA Academic Director Emma Mayhew (e.a.mayhew@reading.ac.uk) or Academic Partner Madeleine Davies (m.k.davies@reading.ac.uk).


Developing our Professional Track

Dr Cindy Becker, Literature and Languages
l.m.becker@reading.ac.uk
Year of activity: 2015-16

Overview

During the summer of 2016, we applied for £300 from the funds of the Teaching and Learning Dean, Dr David Carter, and were awarded the full amount. We were keen to develop the Professional Track, our professional development scheme for students in the School of Literature and Languages, and we needed some external, professional input in order to do this.

Objectives

  • To employ an intern for 45 hours to interview alumni and local employers about the scheme.
  • To find out how well the training courses we offer on the Professional Track are meeting the needs of employers.
  • To increase our contact with alumni and local employers for a variety of reasons.

Context

Our Professional Track gives students the opportunity to undertake certified vocational training and skills development courses alongside their degree. We needed an impartial person to find out whether the range of training we offer is sufficient; we also wanted to generate interest in our planned Professional Board, an advisory team for our professional development activities.

Implementation

Our Professional Track and Placement Facilitators (Sarah Mills and Lucy Stone) prepared the way by making some initial contact with employers, and together we decided on the questions to be asked. Our intern (a Part Two student) was briefed and she then interviewed our contacts in person, by phone and via Skype. She asked whether our courses were relevant to their area of work and whether we should add more. We learned that there are some skills development areas that we do not currently cover in our training; we also discovered that we need to do as much as we can to help our students network professionally. We were delighted to find some ‘warm contacts’ for future academic placements (i.e. activities in the professional world linked to module learning) and we received offers to give Professional Masterclasses to our students. Having several contacts agree to join our Professional Board has inspired us to move ahead with this.

Impact

Next year we will introduce Professional Track courses in social media, starting your own business, journalism, teaching practices, leadership, and project management. We are pleased to learn that all of the courses we currently provide are thought relevant (these include report writing, assertiveness, presentation skills, First Aid, British Sign Language, marketing, and teaching English as a foreign language).

We will stay in touch with the contacts made on the project and other professionals we have been in conversation with over the last year in order to set up our Professional Board, members of which might advise on the development of our teaching in relation to the professional readiness of our students (such as transferable skills acquisition), offer face-to-face support to our students, and get involved in the Professional Track (for example, awarding prizes or speaking at Professional Track events).

We are keen to provide some warm contacts for students wanting to undertake academic or professional placements; this project has allowed us to begin to do that.

Reflections

A relatively modest amount of money can go a long way in helping to move a project forward, but it does need careful planning and plenty of preparation if the money is to be used effectively.

Our intern wrote a full report after each interview and this was crucial in helping us to make the most of the information once she had completed the project.

We would like to do something similar during 2016-17 but we would be more ambitious in terms of our contacts. We would ensure a longer lead time before the interviewing stage of the project began so that we could line up a wide range of contacts.

We plan to keep in touch with the contacts we have made on the project and to use that network to explore more fully the ways in which local professional organisations can be of direct value to our students.

Follow up

Whilst we have found training providers for most of the courses we plan to offer next year, we are struggling to find IT training for our students, so that will be one of our tasks in the coming months. Any advice gratefully received!

Links

Professional Track website

LW2RPP – Research Placement Project

Dr Stavroula Karapapa, Law
s.karapapa@reading.ac.uk

Overview

Research Placement Project (LW2RPP) is a module developed within the School of Law that aims to provide Part Two students with a hands-on experience of the academic research process, from the design of a project and research question through to the production of a research output. It is an optional module that combines individual student research, lectures and seminars.

Objectives

  • To provide students with a hands-on experience of the academic research process, from the design of a project and research question through to the production of a research output.
  • To provide a forum for the development of key research skills relating to the capacity to generate original knowledge.
  • To provide a forum for the development of key skills relating to the presentation of ideas in written form.
  • To give the opportunity to obtain an in-depth understanding of a specific applied topic of legal study.

Context

The module was initially developed as an alternative to Legal Writing Credit (LW2LWC), with a view to offering more optional modules to Law students at Part Two.

Implementation

The module has a unique learning design in that it introduces law students to semi-guided legal research through lectures, seminars and independent student learning. The lectures introduce students to research methods. Seminars are led by experts in a particular area who have a strong interest in a specific topic because they are currently researching it. A variety of topics has been offered throughout the four years that the module has run, spanning international law, criminal law, company law, media law, family law, and more. Students are given the option to choose their group at the beginning of the academic year and to work on topics related to a specific research area.

During the module, students receive formative feedback on two occasions, as they are required to present a piece of preparatory work, such as a literature review or draft bibliography, in their second and third project supervision sessions, with these pieces forming the basis for discussion with their supervisor and with peers. Students are therefore able to use this formative feedback to direct their final output, an assessed essay of 10 pages.

Impact

The objectives of the activity have been met. Students have become acquainted with a particular research area and have developed skills and some experience in legal research writing. Having colleagues deliver seminars on their current areas of research is valuable, as it showcases the wide variety of research in Law that takes place within the School and the subject more generally, and students respond well to this element of the module. The outputs that students produce have generally been of a good quality, and have demonstrated an ability to use appropriate methodologies to conduct and utilise independent research. Involvement in a research project of this nature at Part Two has helped students to develop skills which they then continue to use at Part Three, particularly in their dissertation.

Reflections

The main force behind the success of the module is the contribution of the various colleagues who volunteer every year to offer classes and group supervision to Part Two students.

Take-home exam

Stuart Lakin, Law
s.j.lakin@reading.ac.uk

Overview

In a Part Two Law module, Public Law (LW2PL2), we have moved away from the conventional exam to a take-home exam. We publish the exam paper on Blackboard at an arranged date and time. We give the students approximately 48 hours to complete and submit their answers electronically.

The impact has been entirely positive as compared to the old exam approach. Students prefer this format. The quality of their answers is markedly better. The results are better, and are consistently among the highest of all Part Two modules.

Objectives

  • To ensure that work produced in the exams is presented to a professional standard.
  • To allow students the opportunity to provide greater intellectual depth in their answers, and to allow independent research to form part of the assessment.
  • To have students demonstrate time management, in order to allow them to effectively complete their take-home exam while revising for their other examinations.

Context

We had three reasons for undertaking the activity:

First, we reasoned that LW2PL2 was better suited, pedagogically speaking, to the new format. The subject-matter is theoretical, and we assess by essay only (as opposed to by problem questions). We look for deep understanding of the issues rather than an ability mechanically to apply memorised rules. The take-home format encourages an independent research mindset.

Secondly, we thought it valuable to provide some variety in the way that Part Two students are assessed. The assessment across the Part Two modules had hitherto been by conventional exam only. Whatever the merits and demerits of the traditional exam, it can be refreshing for students to experience some other form of assessment.

Thirdly, we responded to the University call for alternative assessment. On pragmatic grounds, the take-home exam frees up room space and reduces complex timetabling requirements.

Implementation

We prepared the first cohort of students by giving them a mock take-home exam in lieu of their usual non-assessed essay. We asked them to prepare an answer to a question as if they were preparing for the exam itself. We have continued this practice ever since.

In addition, I prepared a detailed explanation of our rationales and expectations for the take-home exam, and provided this to the students. This document exists to inform students of the benefits and the opportunities provided by the format, and also ensures that they fully appreciate the assessment criteria of the format. I talk through this document with the students throughout the year.

Impact

In short, the activity has been highly successful. I believe that colleagues are considering this format for their own modules. Having students word-process their exam answers avoids many of the recognised disadvantages of handwritten answers: handwriting that is slow and uncomfortable, scripts that are messy and hard to read, and the anxiety these problems cause. It is also easier for students to structure their essays.

Because the take-home exam is scheduled during the University exam period, students must manage their time effectively in completing it. Students are made aware that the assumption when marking is that they will have spent approximately two hours answering each question: this gives them more time than a conventional exam, while also leaving space for other commitments they might have, such as revision for other exams.

Above all, we have found that the format is a better way of encouraging scholarly engagement with the module content. We emphasise in our rationales/expectations document that the format has an element of independent research.

The level of success of the activity was unexpected. The first cohort of students to do the take-home exam were nervous and rather distrustful of the activity. Happily, they passed their positive experience down to the next year’s cohort, and that pattern has continued ever since.

Reflections

In my view, the take-home exam format treats students as independent thinkers in a way that the conventional exam does not. The emphasis is on the quality of argument and research rather than on memory skills and the ability to perform under pressure. Having said that, the new format does not entirely dispense with the latter types of skills – there is still a deadline, and students will still need to revise in advance.

There were admittedly risks involved in introducing this new format. LW2PL2 is an extremely important, compulsory module which counts towards the final degree. With hindsight, it may have been more prudent to experiment with this format in a Part One module. On the other hand, we put a great deal of thought into the format, and communicated well with the students. In these respects, we minimised the risks.

Follow up

The activity has remained largely the same as it began. We have experimented with changing the publication and submission times for the exam. We originally published the exam at midnight. This led to many students staying up all night to work on the paper. We now publish the exam at 9 am.

Take Home Exam by Dr Stuart Lakin, School of Law

This post has been uploaded to the T&L Exchange, and can now be found at:

http://blogs.reading.ac.uk/t-and-l-exchange/take-home-exam/

Exploring value co-creation (CCV) in the Law Feedback Project at ESLTIS 2016 by Imogen Moore and Laura Bennett, School of Law

Introduction

As joint staff leaders (together with Dr Nora Honkala) on the current Law Feedback Project, we recently presented a paper exploring aspects of the project at the second annual Enhancing Student Learning Through Innovative Scholarship (ESLTIS) conference, held at University College London on 28-29 June 2016. This blog post explains a little about the Law Feedback Project, how (and why) value co-creation principles were incorporated within it, and what we found useful at the 2016 ESLTIS conference.

The Law Feedback Project and Value Co-Creation

The Law Feedback Project was set up in September 2015, in response to Periodic Review recommendations and to student feedback in the NSS and elsewhere which, while generally positive, indicated some room for improvement. Periodic Review had recommended involving students in the development of feedback (and other) strategies, and this provided us with the impetus to put students at the heart of the project, supported by our Head of School, Professor Susan Breau. Rather than simply seeking student views on assessment and feedback in a way potentially driven and limited by staff preconceptions and preferences, we set up the project drawing on principles of value co-creation, as espoused by writers such as Catherine Bovill and Alison Cook-Sather (Bovill et al, 2012 & 2014; see also McWilliam, 2008; Mihans et al, 2008; Tabrizi & Ackfeldt, 2013).

CCV envisages students acting as partners in learning, moving beyond a consumer-oriented role, and has been successfully used with a wide range of teaching and learning projects. For the Law Feedback Project this would mean involving students from the start and throughout the project – in scoping, designing and running the project, and ultimately creating and implementing changes to policies and practice. Students were recruited on a voluntary basis, via the SSLC, to co-lead the project working group (alongside the three staff members). Additional students participated in focus groups which explored more widely and deeply the issues identified within the working group.

Our primary aim in using CCV was to develop more meaningful assessment and feedback practice that better met student needs, while still recognising system and staffing constraints. The project showed that students had quite clear views on what they needed and what they liked and disliked. While their views often matched staff expectations, this was not always the case. Fears among some staff that students will always demand more feedback proved somewhat unfounded: quality and specificity were favoured over quantity (although quantity mattered too). Importantly, the project indicated that students did not always understand and share the language of assessment and feedback, suggesting that student dissatisfaction with feedback is sometimes due to miscommunication rather than deeper failings. Involving students through CCV will assist in finding a common language for our discourse with students and allow us to identify ways to improve their assessment literacy.

ESLTIS Conference 2016

The paper was well received at the ESLTIS conference, and was followed by some interesting discussion relating to our experiences and the challenges and benefits presented by CCV. It was valuable to have the input of fellow teaching-focused colleagues from a wide variety of institutions and disciplines, in such a supportive and thoughtful atmosphere. In total, the conference was attended by well over 100 teaching-focused staff from institutions across the UK and further afield, with representation from all career levels.

There were two excellent keynote speeches. The first was given by Dr Dilly Fung of UCL, who spoke about her recent HEA publication ‘Rewarding educators and education leaders in research-intensive universities’. Her vision of what education means – and its depth and breadth beyond ‘just’ teaching – was particularly interesting. Professor Carol Evans of the University of Southampton gave the keynote address on the second day: ‘Developing and implementing holistic assessment practice’. Professor Evans looked at bringing together different aspects of good assessment practice, including the importance of students understanding assessment and feedback – something with obvious links to our own project. The rest of the two days offered a multitude of papers under the themes of assessment and feedback, scholarship of teaching and learning, supporting students, and the role of teaching-focused academics – so many stimulating ideas and new approaches to old (and new) problems. We were also treated to an entertaining panel discussion which gave insights into different institutions’ attitudes to teaching-focused staff.

Conclusion

The experience of running the project, and presenting at the conference, has been very rewarding. Following a CCV approach has taken us out of our comfort zone and added another dimension to our teaching and learning, and it was interesting to explore with others how to involve students further in teaching design. As far as the project is concerned, we hope it will continue into 2016-17 (with some change of membership due to staff changes and student graduations), to develop and implement policies and assessment criteria in partnership with students. As for ESLTIS – well, the next conference, which is organised through the Teaching Focussed Academic Network, will be held in Newcastle in the summer of 2017; we hope to see you there!

Group work: students’ solutions to the challenges, by Sonia Hood

Group work is an integral part of assessment at university, but students rarely arrive equipped with the skills, experience and knowledge to deal with the challenges they face when working in groups. This can be a cause of anxiety for students and can also demand time-consuming intervention from lecturers.

Henley Business School approached Study Advice for help in supporting students with this form of assessment. It was felt that students needed help navigating the wide range of resources available to them. In addition, in order to offer effective support, we felt we first needed to understand the challenges students face, how they have overcome (or intend to overcome) these, and how they would best like to be supported in doing so. A project was set up, and we received TLD funding to investigate this further.

The project had two main aims: the first was to create a bank of resources to which students working on assessed group work could be directed; the second was to recommend interventions to support students with the challenges they face when working in groups.

The research

A student researcher was employed to evaluate the wealth of group work resources openly available. This resulted in a folder of group work resources being created and uploaded onto Blackboard. In addition, a pack containing key resources was compiled and handed out to part 1 Real Estate and Planning (REP) students when they commenced their first group work project. We were able to evaluate the effectiveness of this pack within this research.

A range of focus groups and in-depth interviews was conducted with Real Estate and Planning students and HBS staff over the past year. They explored both the perceived challenges of group work and proposed solutions to those challenges. This qualitative data was then analysed, and a number of key challenges, possible solutions and recommendations were presented to Real Estate and Planning teaching and learning staff.

What students want

The interviews and focus groups revealed the complex challenges associated with group work, supporting previous research in this area. Solutions varied between the PG and UG students, though both recognised that effective teams take time to get to know each other informally. Students suggested that informal events could be organised as part of their course to help them through this ‘forming’ stage. PG students also asked for careful consideration of how the mark for group work is allocated (with a higher proportion allocated to individual work) and for a penalty to be imposed on non-contributing members, as a last resort.

More support was requested in dealing with conflict and difficult team members, and a need for more self-reflection from everyone within the group was identified. There are also some simple things we can do to help students with the practicalities of group work, like timetabling group work sessions and booking rooms at set times for students to use. In terms of tutor support, it was recognised that tutors’ time is limited; when it comes to personal issues within a group, speaking to a mentor (such as a part 2 student) who could offer confidential, impartial advice would be a preferable option for UGs.

Resources for your students

We now have a bank of resources to support students with group work, available on Blackboard, which can be copied into any course. The resources are clearly divided into folders and contain a mixture of video tutorials, advice on dealing with challenging situations, self-reflection tools, and group assessment questionnaires. The initial pack handed out to part 1 students proved useful for UGs, mainly as an aid to focus early group discussions. It contained forms to record minutes, ground rules, contact details and roles, as well as advice on the common issues experienced within groups.

Work on this project continues, as at present we are only just starting to disseminate the findings. Whilst the recommendations might not be relevant to everyone engaged in group work, a number of themes and challenges are shared across a variety of disciplines. We would welcome the chance to speak to anyone who is interested in finding out more about this project and how they might benefit from this research.