Using Psychological Techniques to get the most out of your Feedback

Zainab Abdulsattar (student – Research Assistant), Tamara Wiehe (staff – PWP Clinical Educator) and Dr Allán Laville, a.laville@reading.ac.uk, (Dean for D&I and Lecturer in Clinical Psychology). School of Psychology and CLS.

Overview

To help Part 3 MSci Applied Psychology students address the emotional aspect of engaging with and interpreting assessment feedback, we have created a Blackboard feedback tool which draws on self-help strategies used in NHS Mental Health services. This was a TLDF-funded project supported by CQSD, and we reflect on the usefulness of the tool in helping students manage their assessment feedback in a more positive and productive way, both now and in the future.

Objectives

  • To explore the barriers to interpreting and implementing feedback through the creation of a feedback-focused tool for Blackboard
  • To transfer aspects of NHS self-help strategies to the tool
  • To acknowledge the emotional aspect of addressing assessment feedback in Higher Education
  • To support students to engage effectively with feedback

Context

Assessment and feedback are consistently rated as the lowest-scoring area in student surveys, despite efforts from staff to address this. Whilst staff can certainly continue to improve their feedback practices, our efforts turned to how we could improve student engagement in this area. On investigating existing feedback-focused tools, it became apparent that many do not acknowledge the emotional aspect of addressing assessment feedback. For example, the Developing Engagement with Feedback Toolkit (DEFT) has useful components, such as a glossary to help students with academic jargon, but it does not provide resources to help with feedback-related stress. Our aim was to address the emotional aspect of interpreting feedback in the form of a self-help tool.

Implementation

Zainab Abdulsattar’s experience:

Firstly, we carried out a literature review on feedback in higher education and on the NHS self-help strategies, such as cognitive restructuring, used to treat anxiety and depression. We then took these ideas to a student focus group to gather students’ thoughts and opinions on what type of resource they would like to help them understand and use their feedback.

Considering ideas from the literature review and the focus group, we established the components of the tool: a video on the purpose of feedback, problem-solving and cognitive restructuring techniques, a reflective log, and a page signposting further support. We then began creating our prototype Blackboard tool, working collaboratively with the TEL team (Maria, Matt and Jacqueline) to format and launch it. Upon launch, students were given access to the tool via Blackboard, together with a survey to complete once they had explored and used the tool.

Impact

Our prototype Blackboard tool met the main objective of the project: to address the emotional aspect of interpreting assessment feedback. The cognitive restructuring resource aimed to help students identify, challenge and re-balance negative or stressful thoughts related to receiving feedback. Some students reported in the tool survey that they found this technique useful.

The examples also seemed to help students relate the techniques to their past experiences of not getting a good grade. Students appreciated the interactive features, such as the video of the lecturer [addressing the fact that feedback is not a personal attack], and were looking forward to the tool being fully implemented during their next academic year. Overall, the student survey responses were positive, with some suggestions such as making the tool smartphone-friendly and altering the structure of the main page for ease of use.

Reflections

Zainab Abdulsattar’s reflections:

The success of the tool lay in the contributions of both the focus group and the literature review: the students’ ideas from the focus group built on the evidence-based self-help ideas gathered from the literature. Importantly, the hope is that the tool can act as an academic aid, promoting and improving students’ independence in self-managing feedback in a more positive and productive way. Hopefully this will alleviate feedback-related stress both now and in future academic and work settings.

Follow up

In the future, we hope to expand the prototype into a more established feedback-focused tool. To make the tool even more user-friendly, we could consider improving the initial main contents page. For example, presenting options such as ‘I want to work on improving x’, which then lead on to the appropriate self-help resource, instead of simply starting with the resource options [e.g. problem solving, reflective log].

Developing and embedding electronic assessment overviews

Dr Allán Laville, a.laville@reading.ac.uk, Chloe Chessell and Tamara Wiehe

Overview

To develop our assessment practices, we created electronic assessment overviews for all assessments in the Part 3 MSci Applied Psychology (Clinical) programme. Here we reflect on the benefits of completing this project via a student-staff partnership, as well as the realised benefits for students.

Objectives

  • To create electronic assessment overviews for all 8 assessments in Part 3 MSci Applied Psychology (Clinical).
  • To create the overviews via a student-staff partnership with Chloe Chessell. Chloe is a current PhD student and previous MSci student.

Context

The activity was undertaken due to the complexity of the Part 3 assessments. In particular, the clinical competency assessments have many components, so providing only an in-class overview has limitations. The aim was for students to be able to review assessment overviews at any time via Blackboard.

Implementation

Allán Laville (Dean for Diversity and Inclusion) and Tamara Wiehe (MSci Clinical Educator) designed the electronic assessment overview concept and then approached Chloe Chessell to see whether she wanted to take part in the development of these overviews. It was important to include Chloe here as she has lived experience of completing the programme and therefore, can offer unique insight.

Chloe Chessell’s experience

The first stage in assisting with the development of electronic assessment resources for MSci Applied Psychology (Clinical) students involved reflecting upon the information my cohort was provided with during our Psychological Wellbeing Practitioner (PWP) training year. Specifically, this involved reflecting upon information about the assessments that I found particularly helpful; identifying any further information which would have benefitted my understanding of the assessments; and suggesting ways to best utilise screencasts to supplement written information about the assessments. After providing this information, I had the opportunity to review and provide feedback on the screencasts which had been developed by the Clinical Educators.

Impact

Chloe shares her view of the impact of completing this activity:

The screencasts that have been developed added to the information that I had as a student, as this format allows students to review assessment information in their own time, and at their own pace. Screencasts can also be revisited, which may help students to ensure they have met the marking criteria for a specific assessment. Furthermore, embedded videos/links to information to support the development of key writing skills (e.g. critical analysis skills) within these screencasts expand upon the information my cohort received, and will help students to develop these skills at the outset of their PWP training year.

Reflections

Staff reflections: The student-staff partnership was key to the success of the project, as we needed to ensure that the student voice was at the forefront. The electronic assessment overviews have been well received by students and we are pleased with the results. Based on this positive experience, a further four student-staff projects are currently being completed, which we hope to publish on the T&L Exchange in due course.

Chloe Chessell’s reflections:

I believe that utilising student-staff partnerships to aid course development is crucial, as it enables staff to learn from students’ experiences of receiving course information and their views on course development, whilst ensuring overall course requirements are met. Such partnerships also enable students to engage with their course at a higher level, allowing them to have a role in shaping the course around their needs and experiences.

Follow up

In future, we will aim to include interactive tasks within the screencasts, so students can engage in deep level learning (Marton, 1975). An example could be for students to complete a mind map based on the material that they have reviewed in the electronic assessment overview.

‘A-level Study Boost: Unseen Poetry and the Creative Process’: an online course

Rebecca Bullard, School of Literature and Languages, r.bullard@reading.ac.uk

Overview

‘A-level Study Boost: Unseen Poetry and the Creative Process’ is a two-week online course created by staff and students in the Department of English Literature and the Online Courses team, and hosted on the social learning platform, FutureLearn. It engages a global audience of learners in reading, writing, discussing, and enjoying poetry.

Objectives

The analysis of poetry, sometimes called ‘close reading’ or ‘practical criticism’, is a core skill for the study of English Literature. This course aims to develop this skill in pre- and post-A-level students of English Literature in ways that supplement teaching in schools and FE colleges. In doing so, it encourages students to make a successful transition from A-level to university-level study of English and Creative Writing.

Context

The Online Courses team at UoR approached colleagues in the Department of English Literature to work with them to develop a course that would connect students’ pre-university learning with their studies at UoR. The resulting online course develops learners’ subject-specific skills and gives them insight into what studying English and Creative Writing at university level might be like.

Implementation

Staff in the Online Courses team and Department of English Literature worked together to combine their diverse areas of expertise. Yen Tu, Digital Learning Producer, supported by Sarah Fleming, Assistant Digital Learning Producer, ensured that the course reflects best practice in the pedagogy of online social learning (Sharples 2018; Laurillard 2014). Rebecca Bullard, as subject specialist, wrote the articles and designed tasks and activities to develop learners’ creative and critical skills.

It took about six months of intensive collaboration to produce the course materials. The first live run of the course took place over two weeks in December 2019. Rebecca and a team of student mentors engaged with learners on the FutureLearn platform throughout the live run to facilitate social learning and encourage completion of the course. The course content, feedback and statistics are currently being evaluated in order to measure impact and inform the next run.

Impact

The impact of the initial run of this course can be evaluated against the UoR Evaluation and Impact Framework (L1: Reach, L2: Reaction, L3: Learning, L4: Behaviour), drawing on course analytics and comments from learners. Some participants gave permission for us to use their comments; where permission was not explicitly given, comments have been paraphrased:

L1: c. 1970 learners from over 100 countries enrolled on the first live run of this course. Comments on completing the course included the following:

L2: “I have always loved poetry but found some modern poems inaccessible. This course [has] shown me some ways to gain access.”

L3/4: “I’m a school teacher, having to teach unseen texts next year. This course has made me enjoy reading and dissecting poetry and I hope that I’ll succeed in inspiring my students to do the same.”

L3/4: One learner commented that the course has changed her perspective on poetry and that she is considering applying to UoR as a result of this course.

Reflections

The success of the course emerged out of the different kinds of collaboration that it involved and encouraged:

Staff-student: The course highlighted the expertise of UoR staff and students. The course videos showcase real teaching methods used in the Department of English Literature, and offer tangible evidence of the academic excellence and outstanding learning experience that underpin the UoR T&L Strategy 2018-21. Current students were paid to work as mentors on the course, giving them confidence in their own expertise.

English Literature-Creative Writing: The course engages learners in both critical analysis and creative practice, reflecting research that indicates the close relationship between these different methods of approaching literary studies (Lockney and Proudfoot 2013).

Department of English Literature-Online Courses: Specialists in both areas drew on their different kinds of expertise to develop a structure, set of activities, tone and style for the course that encourage maximum engagement from learners.

Learner-Educator-Mentor: The social learning platform FutureLearn facilitates active, real-time conversations between Learners, Educators and Mentors, which strengthens and deepens their engagement with the course material.

Follow up

During 2020, further research will be undertaken to evaluate the impact of the course on particular learner groups. The Online Courses team will run a research study to evaluate how teachers (including those in WP areas) are using the course in their teaching. The Department of English Literature will evaluate the impact of the course on students enrolled on EN1PE: Poetry in English.

‘Unseen Poetry’ will be an exemplar for a new ‘A-Level Study Boost’ series which will be rolled out to other Schools across UoR.

Links

‘A-level Study Boost: Unseen Poetry and the Creative Process’: https://www.futurelearn.com/courses/a-level-study-unseen-poetry

References

Laurillard, Diana. 2014. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. Abingdon: Routledge.

Lockney, K. & K. Proudfoot. 2013. ‘Writing the unseen poem: Can the writing of poetry help to support pupils’ engagement in the reading of poetry?’ English in Education 47:2, 147-162.

Sharples, M. 2018. The Pedagogy of FutureLearn: How our learners learn. https://about.futurelearn.com/research-insights/pedagogy-futurelearn-learners-learn

Piloting General Practice (GP) experiential learning for MPharm Year 3 students

Catherine Langran, Lecturer in Pharmacy Practice, School of Pharmacy

Daniel Mercer & Selen Morelle, MPharm Part 4 students, School of Pharmacy

Background

Throughout the Master of Pharmacy degree (MPharm), students undertake experiential learning in hospital and community pharmacies. Experiential learning through placements is an important approach to teaching and learning: it provides a safe learning environment for students, bridges the gap between theory and practice, and encourages independent learning and reflective practice.

In 2016, the National Health Service (NHS) launched the “Building the General Practice Workforce” programme, creating a new career pathway for pharmacists performing clinical tasks in a primary care setting. Over the past three years a steadily increasing number of pharmacists have pursued this career option, and it is now a graduate opportunity for our MPharm students. It is therefore crucial that Reading School of Pharmacy provides undergraduate students with an opportunity to experience this new role, giving them more insight into their career options, encouraging professional and personal development, and boosting employability.

This collaborative partnership project piloted placements within GP practices for Part 3 pharmacy students to assess the students’ perceptions and evaluate the benefits and practicality of the placements.

Method

59 Part 3 students (46% of the cohort) attended a voluntary session in November 2018, prior to submitting the PLanT application. This session demonstrated a high level of student interest in this placement opportunity and also involved discussion of the practicalities (e.g. placement length, positioning within timetable, location) and perceived advantages of offering GP placements.

Following a successful bid to the PLanT fund, a second voluntary session was attended by 22 students who collaboratively worked with the project lead to determine the process of student recruitment and allocation to placements, define the placement learning outcomes, placement activities, evaluation methods and how to collect feedback. Subsequently, the two project lead students worked with the lead academic to construct an online application process, review student applications, finalise the student handbook and evaluate the student feedback.

The main objectives of this project were:

  • To evaluate the benefits of undertaking the GP placements for MPharm students.
  • To evaluate the placement provider’s feedback on the acceptability, practicality and scalability of providing placements for students.

Five GP practices, located in Reading and London, were recruited to take part in the pilot. From April to June 2019, a total of 37 Part 3 MPharm students completed a half-day to one-day placement in one of the five practices. Students predominantly shadowed the GP pharmacist within a clinic environment, and some also had the opportunity to shadow GPs, nurses, physician associates and reception teams, providing a greater understanding of how general practices function as a business.

Data were collected via online questionnaires completed by students before and after the GP placements to compare their:

  • Understanding of the role of GP pharmacists and of how GP surgeries work (0 = no knowledge to 10 = complete knowledge)
  • Confidence in building rapport and being empathetic when talking to patients (0 = no confidence to 10 = fully confident)

Students also decided that they would like to prepare and deliver a short 5-minute verbal presentation to their peers and the project group to share experiences and insights from their GP placement.

We also collected feedback from placement providers after completion of the placements.

Results

37 students completed the pre-placement questionnaire, and 30 students completed the post-placement questionnaire. Analysis of the data shows that the students who undertook the placement displayed a significant improvement in their understanding of the GP pharmacist role and the structure and running of a GP practice. A moderate increase in empathy and building rapport was also seen.
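The pre/post design described above lends itself to a simple paired comparison. The following is a minimal, illustrative sketch in Python of how such paired 0-10 ratings could be analysed; the scores shown are invented placeholders rather than the project’s data, and the choice of a Wilcoxon signed-rank test (a common option for paired ordinal ratings) is assumed here for illustration, not necessarily the method used in the project.

    # Illustrative sketch only: hypothetical ratings, not the project's data.
    from scipy.stats import wilcoxon

    # Paired ratings from the same students before and after the placement
    # (0 = no knowledge, 10 = complete knowledge).
    pre_understanding = [3, 4, 2, 5, 3, 4, 6, 2, 3, 5]
    post_understanding = [7, 8, 6, 8, 7, 7, 9, 6, 7, 8]

    # Wilcoxon signed-rank test: a non-parametric test for paired ordinal data.
    stat, p_value = wilcoxon(pre_understanding, post_understanding)

    # Average change in self-rated understanding across the paired responses.
    mean_change = sum(post - pre for pre, post in
                      zip(pre_understanding, post_understanding)) / len(pre_understanding)
    print(f"Mean change: {mean_change:+.1f} points; Wilcoxon p = {p_value:.3f}")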

Students’ evaluations of the GP placements were overwhelmingly positive, highlighting improved knowledge of the role of GP pharmacists and insight gained into their potential career choices.


In their peer presentations, students described key learning points:

  • An understanding of how different healthcare professionals’ skills can work together to offer the best care to patients
  • The value of observing pharmacist consultations with patients, and reflecting on how treatment decisions are made
  • An increased understanding of the options available to them after graduation, enabling them to make a more informed career choice.

Feedback from placement providers showed that they found hosting the placements enjoyable and rewarding, that they felt the students were enthusiastic, and that the organisation and communication from the university were excellent.

Limitations

Whilst the cohort of students who attended the placement days appears to have improved their understanding of GP pharmacy, we are aware that the students undertook the placements voluntarily. These students had a desire to explore the role of GP pharmacists, which implies that they had an interest in the area prior to undertaking the placement. Their opinions may therefore be favourably biased towards the role.

Impact

The student co-design element ensured this pilot delivered an authentic and valuable experience, with high levels of student engagement.

As a result of this pilot, funding has been secured from our Head of Department to implement GP placements for all Part 3 students (cohort size 106) from December 2019. Working partnerships have been established with the five GP practices, and this network has now been expanded to 16 GP practices for 2019/2020. Embedding GP placements for our students will have a positive impact on the MPharm re-accreditation by our regulator, the General Pharmaceutical Council, in March 2020.

There is the potential for this project to have a long-term impact on NSS results and employability, which will be explored in June 2020. Offering these placements sets us apart from other schools of pharmacy and is a key selling point in our new UCAS brochure.

‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Rita Balestrini and Elisabeth Koenigshofer, School of Literature and Languages, r.balestrini@reading.ac.uk; e.koenigshofer@reading.ac.uk

Overview

Working in collaboration with two Final Year students, we designed two ‘flexible’, ‘minimalist’ rubric templates, usable and adaptable across different languages and levels, to provide a basis for the creation of level-specific, and potentially task-specific, marking schemes in which sub-dimensions can be added to the main dimensions. The two marking templates are being piloted this year in the DLC. The project will feature in this year’s TEF submission.

Objectives

Design, in partnership with two students, rubric templates for the evaluation and feedback of writing tasks and oral presentations in foreign languages which:

  • were adaptable across languages and levels of proficiency
  • provided a more inclusive and engaging form of feedback
  • responded to the analysis of student focus group discussions carried out for a previous TLDF-funded project

Context

As a follow-up to a teacher-learner collaborative appraisal of rubrics used in MLES, now DLC, we designed two marking templates in partnership with two Final Year students, who had participated in the focus groups from a previous project and were employed through Campus Jobs. ‘Acknowledgement of effort’, ‘encouragement’, ‘use of non-evaluative language’, ‘need for and, at the same time, distrust of, objective marking’ were recurrent themes that had emerged from the analysis of the focus group discussions and clearly appeared to cause anxiety for students.

Implementation

We organised a preliminary session to discuss these findings with the two student partners. We suggested some articles about ‘complexity theory’ as applied to second language learning (Kramsch, 2012; Larsen-Freeman, 2012; 2015a; 2015b; 2017), with the aim of making our theoretical perspective explicit and transparent to them. A second meeting was devoted to collaboratively planning the structure of two marking schemes, for writing and for presentations. The two students worked independently to produce examples of standard descriptors which avoided the use of evaluative language and emphasised achievement rather than shortcomings. At a third meeting they presented and discussed their proposals with us. In subsequent meetings, we continued working to finalise the templates and the two visual learning charts they had suggested. Finally, the two students wrote a blog post to recount their experience of this collaborative work.

The two students appreciated our theoretical approach, felt that it was in tune with their own point of view and believed that it could support the enhancement of the assessment and marking process. They also found resources on their own, which they shared with us, including rubrics from other universities. They made valuable suggestions, gave us feedback on our ideas and helped us to find alternative terms when we were struggling to avoid evaluative language in our descriptors. They also suggested making use of some visual elements in the marking and feedback schemes in order to increase immediacy and effectiveness.

Impact

The two marking templates are being piloted this year in the DLC. They were presented to colleagues over four sessions during which the ideas behind their design were explained and discussed. Further internal meetings are planned. These conversations, already begun with the previous TLDF-funded project on assessment and feedback, are contributing to the development of a shared discourse on assessment, which is informed by research and scholarship. The two templates have been designed in partnership with students to ensure accessibility and engagement with the assessment and feedback process. This is regarded as an outstanding practice in the ‘Assessment and feedback benchmarking tool’ produced by the National Union of Students and is likely to feature positively in this year’s TEF submission.

Reflections

Rubrics have become mainstream, especially within certain university subjects such as Foreign Languages. They have been introduced to ensure accountability and transparency in marking practices, but they have also created new problems of their own by promoting a false sense of objectivity in marking and grading. The openness and unpredictability of complex performance in foreign languages, and of the dynamic language learning process itself, are not adequately reflected in the detailed descriptors of the marking and feedback schemes commonly used for the objective numerical evaluation of performance-based assessment in foreign languages. As emerged from the analysis of focus group discussions conducted in the department in 2017, the lack of understanding of, and engagement with, the feedback provided by this type of rubric can generate frustration in students. Working in partnership with students, rather than simply listening to their voices or seeing them as evaluators of their own experience, helped us to design minimalist and flexible marking templates which use sensible and sensitive language, introduce visual elements to increase immediacy and effectiveness, leave a considerable amount of space for assessors to comment on different aspects of an individual performance, and provide ‘feeding forward’ feedback. This type of partnership can be challenging because it requires remaining open to unexpected outcomes. Whether it can bring about real change depends on how its outcomes interact with the educational ecosystems in which it is embedded.

Follow up

The next stage of the project will involve colleagues in the DLC who will be using the two templates to contribute to the creation of a ‘bank’ of descriptors by sharing the ones they will develop to tailor the templates for specific stages of language development, language objectives, language tasks, or dimensions of student performance. We also intend to encourage colleagues teaching culture modules to consider using the basic structure of the templates to start designing marking schemes for the assessment of student performance in their modules.

Links

An account written by the two student partners involved in the project can be found here:

Working in partnership with our lecturers to redesign language marking schemes

The first stages of this ongoing project to enhance the process of assessing writing and speaking skills in the Department of Languages and Cultures (DLC, previously MLES) are described in the following blog entries:

National Union of Students 2017. The ‘Assessment and feedback benchmarking tool’ is available at:

http://tsep.org.uk/wp-content/uploads/2017/07/Assessment-and-feedback-benchmarking-tool.pdf

References

Bloxham, S. 2013. Building ‘standard’ frameworks. The role of guidance and feedback in supporting the achievement of learners. In S. Merry et al. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

Bloxham, S. and Boyd, P. 2007. Developing effective assessment in Higher Education. A practical guide. Maidenhead: McGraw-Hill International.

Bloxham, S., Boyd, P. and Orr, S. 2011. Mark my words: the role of assessment criteria in UK higher education grading practices. Studies in Higher Education 36 (6): 655-670.

Bloxham, S., den-Outer, B., Hudson, J. and Price, M. 2016. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 41 (3): 466-481.

Brooks, V. 2012. Marking as judgement. Research Papers in Education. 27 (1): 63-80.

Gottlieb, D. and Moroye, C. M. 2016. The perceptive imperative: Connoisseurship and the temptation of rubrics. Journal of Curriculum and Pedagogy 13 (2): 104-120.

HEA 2012. A Marked Improvement. Transforming assessment in HE. York: The Higher Education Academy.

Healey, M., Flint, A. and Harrington K. 2014. Engagement through partnership: students as partners in learning and teaching in higher education. York: The Higher Education Academy.

Kramsch, C. 2012. Why is everyone so excited about complexity theory in applied linguistics? Mélanges 33: 9-24.

Larsen-Freeman, D. 2012. The emancipation of the language learner. Studies in Second Language Learning and Teaching. 2(3): 297-309.

Larsen-Freeman, D. 2015a. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Larsen-Freeman, D. 2015b. Complexity Theory. In VanPatten, B. and Williams, J. (eds.) 2015. Theories in Second Language Acquisition. An Introduction. New York: Routledge: 227-244.

Larsen-Freeman, D. 2017. Just learning. Language Teaching 50 (3): 425-437.

Merry, S., Price, M., Carless, D. and Taras, M. (eds.) 2013. Reconceptualising feedback in Higher Education. Abingdon: Routledge.

O’Donovan, B., Price, M. and Rust, C. 2004. Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education 9 (3): 325-335.

Price, M. 2005. Assessment standards: the role of communities of practice and the scholarship of assessment. Assessment & Evaluation in Higher Education 30 (3): 215-230.

Sadler, D. R. 2009. Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education 34 (2): 159-179.

Sadler, D. R. 2013. The futility of attempting to codify academic achievement standards. Higher Education 67 (3): 273-288.

Torrance, H. 2007. Assessment as learning? How the use of explicit learning objectives, assessment criteria and feedback in post-secondary education and training can come to dominate learning. Assessment in Education 14 (3): 281-294.

VanPatten, B. and Williams, J. (eds.) 2015. Theories in Second Language Acquisition: An Introduction. 2nd edition. New York: Routledge.

Yorke, M. 2011. Summative assessment: dealing with the ‘Measurement Fallacy’. Studies in Higher Education 36 (3): 251-273.

Embedding Employability Through Collaborative Curriculum Design



Amanda Millmore / School of Law / a.millmore@reading.ac.uk

Overview

This is a practical case study focusing upon the process of carrying out a collaborative partnership project with students to embed employability attributes into a trailblazing new module option for 2019/20, LW3CFS: Children, Families and the State. This module is unique in that it is the first to embed employability attributes and skills within the module design. The project built upon previous work within the School of Law which, by working with multiple stakeholders (students, staff and employers), identified 11 key employability attributes of a Reading Law graduate.

Not only do we now have a module with employability attributes built-in, but the student partners have gained a range of employability skills themselves by virtue of their involvement in the process. The student partners co-designed the module assessments, ran the student focus groups and presented the project at a number of national teaching and learning conferences this year. PLanT project funding was awarded and used to provide refreshments for focus groups and to enable students to travel to conferences to disseminate the project.

Objectives

I identified 3 key challenges that the project aimed to address:

  • Employability - how to equip students with the skills and attributes to succeed in employment.
  • Curriculum Design - how to embed those graduate employability attributes into a module.
  • Student Engagement and Collaboration - how to work effectively with students in partnership.

Context

In Law the professional pathways to careers are changing, with new routes opening up for vocational post-graduate and non-graduate training. These changes are raising questions for university law schools as to how much they should be focusing upon more practical and vocational skills.

My colleague Dr Annika Newnham and I wanted to develop a new final year module covering a discrete area of family law, closely allied to the kind of work that students may encounter in their early years of legal practice, with assessments mapped to legal employability skills. The brief was to design assessments for this new module mapped to those skills, and I looked at how I could incorporate the student voice within the design process, deciding to engage students in the project as collaborative partners.

Implementation

Evaluation

Student views of their involvement in focus groups and as part of the core partnership group were sought throughout the project. All felt that this was a positive experience and welcomed the partnership and mapping of employability attributes.

Evaluation of the effectiveness of embedding employability into the module will take place while the module runs. In addition to explicitly highlighting the attributes within the course materials and teaching, I intend to ask the students to self-evaluate their awareness of, and confidence in displaying, the attributes at the start and again at the end of the module. I am also considering ways to utilise the assessed evaluative report to encourage reflection upon employability attributes. If the students permit, I would also be interested in maintaining contact with them post-graduation to follow up on whether these skills have assisted them in their further study and careers.

Impact

Employability: The student partners have all developed employability skills from their involvement, in particular improved confidence, communication skills and leadership skills. These skills have been highlighted most through the opportunities they have had to disseminate the project at national conferences. The wider student body has increased awareness of employability attributes.

Curriculum Design: The new module LW3CFS Children, Families and the State has student-designed assessments with employability attributes clearly mapped to them. Students involved have gained a greater understanding of the process of module design. The students acknowledged that this was a way for their opinions to be listened to, and for them to influence their own university experience, “University can be a very impersonal experience - it is always good to feel that your voice is being heard and that you can make an active impact on uni life and module development” (focus group participant). The module is oversubscribed in 2019/20 and is operating a waiting list. The high level of student interest (approximately 20% of the cohort have selected the module, which is significant given the rather niche subject area) is indicative of the support by students for the nature and timing of the assessments and an implicit endorsement of the staff-student partnership process.

Student Engagement & Collaboration: Students feel that they have been listened to and treated as true, equitable partners in the process, which embodies the University of Reading’s “Principles of Partnership” (2019). This has created greater feelings of community and power-sharing within the School of Law. The equitable nature of the power-sharing between staff and students was fundamental to the success of the project. This experience has been transformative for me as an academic: seeing how positively these students relished the challenge of collaboration and became true partners in co-designing assessments has inspired me to look at other areas of my teaching practice and consider how I can partner with students to improve the student experience and student support, in addition to classic teaching and learning activities. Students are interested in extending this trailblazing process to other modules, and colleagues and I are looking at expanding it to programme level.

Student Feedback: The following quotes are reflections from the student partners on the project:

"With all the discussions, I gained knowledge about the employability skills (communication, team work, problem solving, planning and organising, self-management, learning, research and analysis and the list goes on) and will take active actions to try to improve those skills in the future. I think I gained a lot of experience in involving in this project that I can put into practice into future projects or career as well."

"I am really looking forward and excited to learn about this module that I helped create. I think the School should definitely use this approach more often on other modules as a lot of the time when students are not satisfied/happy about how a module (or lecturer) we do not have much chance to voice out our opinions and make changes, so it is a good way to avoid that situation fundamentally. As students are likely to go into law practice after graduating, it is important to not only have essay or written examinations (that do not reflect real life law practice) as assessments. It’s really different to be good in examinations and to be good in practice."

Reflections

When I presented this project at the Advance HE conference in July 2019 I emphasised my 4 step plan for successful staff-student partnerships:

The partnership can relate to a discrete area of a project (in our case this was in relation to assessment design), and this fits well with Bovill’s (2017) ladder of participation. Once the boundaries of the project are clear, then it is vital to take a step back and relinquish control.

By keeping the student-staff partnership limited to a discrete area of module design (assessments), the boundaries were clear and students could be given greater control. The key message is that equality of arms is vital: all viewpoints need to be welcomed and considered, with no obvious staff-student hierarchy.

A limitation of the project was that it focused upon the modular level rather than anything broader, so its impact is limited to that module, although the goodwill it has generated amongst our students extends far beyond this single module.

A staff-student partnership needs to be approached with an equality of arms, so that all viewpoints are welcomed and considered, with no obvious hierarchy. As my student partner said when presenting at the Advance HE conference: “For me personally as a student, you’re very much stuck in this kind of limbo where you’re not quite respected as an adult, but you’re not a child either...I’m an adult but not as respected as I would like to be in a professional environment. I wasn’t treated like that, I was treated as a complete equal and had the chance to run with my ideas, which was really important to me.”

Follow up

The module is due to run for the first time in 2019/20 for Final Year students in the School of Law.

My current plans for follow-up relate to the following areas:

  1. Further evaluation of the effectiveness of embedding employability attributes into a module (see evaluation section above).
  2. Consideration of better ways to highlight the employability attributes, for example by badging them (opening up possibilities for inter-disciplinary collaborations with creative colleagues and students).
  3. The success of this staff-student partnership has highlighted how this process could be scaled up to programme level within the School of Law. This is particularly in the light of reviews of the LLB programme within the context of the University of Reading’s Curriculum Framework review process and with an eye to the forthcoming changes to the professional vocational training at postgraduate level for lawyers. One of the challenges will be how we can widen and diversify the range of students in future curriculum design partnerships.

TEF

TQ1-5, SO1-3.

Links and references

ADVANCE HE 2016. Framework for embedding employability in higher education. Available from: https://www.heacademy.ac.uk/knowledge-hub/framework-embedding-employability-higher-education.

ADVANCE HE 2016. Framework for student engagement through partnership. Available from: https://www.heacademy.ac.uk/sites/default/files/downloads/student-enagagement-through-partnership-new.pdf.

BOVILL, C. 2017. A Framework to Explore Roles Within Student-Staff Partnerships in Higher Education: Which Students Are Partners, When, and in What Ways? International Journal for Students as Partners, 1 (1). https://doi.org/10.15173/ijsap.v1i1.3062.

HEALEY, M., FLINT, A & HARRINGTON, K. 2014. Students as Partners in Learning & Teaching in Higher Education [Online]. York: Higher Education Academy. [Viewed on 1 July 2019] Available from: https://www.heacademy.ac.uk/knowledge-hub/engagement-through-partnership-students-partners-learning-and-teaching-higher.

ArtLab



Tina O’Connell / Art / t.oconnell@reading.ac.uk

Overview

ArtLab is a dedicated art and technology facility that supports Outreach and Widening Participation by bringing a wide range of children from different social backgrounds into contact with cutting-edge art and technology projects co-delivered with undergraduate Art students (as student co-researchers). Underlying this idea is a set of core educational values concerning a deeper understanding of computing, digital media and new technologies, which will form part of a vibrant cultural and economically viable society both today and in the future. The impact on our students derives from the experience they have gained in delivering numerous workshops with primary and secondary schools, as well as a range of other public institutions, over the last four years (see below). Further to this, the intention is to share knowledge and experience in order to provide a focus for other University of Reading departments’ initiatives in this area.

Objectives 

  • To work with undergraduate co-researchers, who learn about a wide range of cutting-edge art and technology projects and co-deliver art-based pedagogic workshops to children from across different social backgrounds.
  • To enable our student co-researchers to understand their potential role as educators, and the value that art brings when combined with developing skills in technology, including offering potential careers within the fastest growing sector in the UK, the creative economy.
  • To encourage these co-researcher students to share their positive experiences with WP schools, helping a shift in mindset from STEM to STEAM (including Arts) and opening up this area to help pupils develop academic and practical skills that are not currently taught in mainstream schools.
  • To introduce pupils from State schools to University contexts through direct experiences with our co-researchers who are frequently students of similar backgrounds.

Context

Within the current educational context, and despite the global acclaim and economic success of the UK creative economy, Art and associated disciplines are increasingly de-prioritised or excluded from school curricula, and Art and science/technology are presented as separate (or mutually exclusive) spheres of study. As a result, our Art students, whilst being some of the most sought-after graduates, are increasingly positioned as studying a non-essential subject, and Arts subjects are increasingly inaccessible to children from state schools.

Implementation

Artlab was set up in 2013 to address these issues and has subsequently organised over 100 workshops, school visits, open days and other events at primary and secondary schools and public institutions such as the Tate and MERL with our student co-researchers.

For instance, in the last academic year (2017-18) Artlab delivered workshops to 29 primary schools, involving co-researchers and around 870 children, and working closely with 52 teachers and teaching assistants. If we include Reading Scholars, Stellar Projects at MERL, Reading International, Tate Exchange and other WP workshops, ArtLab co-researchers engaged with over 4,800 potential beneficiaries in the process. One of the planned activities being coordinated at this time is Tate Exchange, a dynamic public engagement programme that will be central to the public-facing activity of Artlab and to educating our students about the nature and reach of social practice. Spearheaded by Artlab, this represents an opportunity for our co-researchers (Reading students), as well as the Stellar Project team (children from Maiden Erlegh School) and our Reading Scholar students (external A-level students), to work with us at Tate on new projects that engage with the idea of ‘production’, in particular drawing on the ideas and approaches to art and technology that we have successfully pursued to date.

Impact

  • ArtLab was nominated and shortlisted for a Reading Cultural Award.
  • Artlab is a partner of Reading International, which has received financial support from the Arts Council of England.
  • ArtLab has helped MERL and Reading Museum in their application to ACE to become an NPO, securing £8,000 per year for three years for ArtLab to deliver Arts Mark activity.
  • Providing teaching experience and mentoring for our Reading students (co-researchers), as well as Reading Scholars, by inducting them into the use of new technologies.
  • Co-researchers are supported in how to conceive, deliver and take part in workshops, and this has an impact on their future career choices and their skills base within the creative economy.
  • The ArtLab Placement is a 20-credit module (part of the successful Arts Mark NPO bid, see above); this year there are three students on the module, delivered by ArtLab in association with Christ the King Primary School (CKP), Maiden Erlegh East (MEE) Secondary School, MERL and Reading Museum.
  • Summer workshops: over 24 days in June and July each year, ArtLab works with 10 local WP primary schools in Berkshire, UK, involving undergraduates, postgraduates, lecturers, teachers and school students working as co-researchers.
  • ArtLab’s PhD Student Outreach: the University funds a fees-only PhD student.

Reflections

There continues to be strong evidence in our evaluation that the activity has led to deeper forms of engagement by our students, as well as an increase in applications to the University. The evidence shows benefits for students across a range of skills and interests. The success has exceeded our aims: much of it is evident in the progression of our students and in their impact on the cultures and values being developed in primary schools, which then help to shift attitudes among secondary school pupils. The approach has been effective in helping pupils resist pressure to drop Art. This shift in attitudes increases the confidence of both our students and the associated pupils in undertaking degrees in which they can see clear career prospects relating to the creative and analytical skills outlined above. For the University, this partly explains the uptake of joint honours degrees by some pupils, providing further evidence of the value of the work to admissions.

Follow Up 

We will continue our approach, which has proven to be successful and has attracted very considerable support and recognition, including further funding from across the University and from independent organisations such as the Arts Council of England.

TEF

TQ1 LE1 SO1 LE2 SO2 LE3 SO3 TQ5

Links
https://readingartlab.com/

Sharing the ‘secrets’: Involving students in the use (and design?) of marking schemes

Rita Balestrini, School of Literature and Languages, r.balestrini@reading.ac.uk

Overview

Between 2016 and 2018, I led a project aiming to enhance the process of assessing foreign language skills in the Department of Modern Languages and European Studies (MLES). The project was supported by the Teaching and Learning Development Fund. Its scope involved two levels of intervention: a pilot within one Part I language module (Beginners Italian Language) and other activities involving colleagues in all language sections and students from each year of study. The project started a bank of exemplars for the assessment of a Part I language module, promoted discussion on marking and marking schemes within the department, and made possible a teacher-learner collaborative appraisal of rubrics.

Objectives

  • To enhance Beginners Italian Language students’ understanding of rubrics and their assessment literacy
  • To increase their engagement with the assessment process and their uptake of feedback
  • To engage MLES students as agents of change in the assessment culture of the department
  • To stimulate innovation in the design of rubrics within the MLES Language Team and contribute to developing a shared discourse on assessment criteria and standards informed by the scholarship of assessment

Context

In recent years, there has been an increasing demand to articulate the standards of assessment explicitly and to make them transparent in marking schemes in the form of rubrics, especially in Foreign Languages. It is widely held that the use of rubrics increases the reliability of assessment and fosters autonomy and self-regulation in students. However, it is not uncommon for students not to engage with the feedback that rubrics are supposed to provide. In 2016, the language team of the Department of Modern Languages and European Studies started to work on the standardisation and enhancement of the marking schemes used to assess language skills. The aim of this multi-layered project was to make a positive contribution to this process and to pilot a series of activities for the enhancement of foreign language assessment.

Implementation

  • Review of research literature and scholarly articles on the use of standards-based assessment, assessment rubrics, and student-derived marking criteria.
  • Presentation on some of the issues that emerged from the review at a School T&L Away Day on assessment attended by the MLES language team (April 2017) and at a meeting of the Language Teaching Community of Practice (November 2017).
  • Organisation of a ‘professional conversation’ on language assessment, evaluation and marking schemes as a peer review activity in the School of Literature and Languages (SLL). The meeting was attended by colleagues from MLES and CQSD (February 2018).
  • 2016-17 – Two groups of students on the Beginners Italian Language module were asked for permission to use exemplars of their written and oral work for pedagogic practice and research. Ten students gave their informed consent.
  • Collection of written and oral work, double-marked by a colleague teaching one of the groups.
  • 2017-2018 – Organization of two two-hour workshops on assessment for a new cohort of students. Aim: To clarify the link between marking criteria, learning outcomes and definitions of standards of achievement of the module. An anonymised selection of the exemplars collected the previous year was used a) ‘to show’ the quality of the standards described in the marking schemes and b) for marking exercises.
  • 2017 – Organisation of three focus groups with students – one for each year of study – to gain insights into their perspectives on the assessment process and understanding of marking criteria. The discussions were recorded and fully transcribed.
  • The transcriptions were analysed by using a discourse analysis framework.
  • Some issues emerged from the analysis: the atomistic approach of rubrics; vagueness of the standards; subjectivity of the evaluation; problematic measuring of different aspects of achievement; and the anchoring of rating scales (for a more comprehensive account of the focus groups see the Engage in T&L blog post Involving students in the appraisal of rubrics for performance-based assessment in Foreign Languages).
  • Development, in collaboration with three students from the focus groups, of a questionnaire on the use of rubrics. The questionnaire was intended to gather future students’ views on marking schemes and their use.

Impact

This multi-layered project contributed to enhancing the process of assessing foreign language skills in MLES in different ways.

  • The collection of exemplars for the Beginners Italian Language module proved to be a useful resource that can also be used with future cohorts. The workshops were not attended by all students, but those who did attend engaged in the activities proposed and asked several interesting questions about the standards of achievement described in the marking schemes (e.g. grade definitions; use of terms and phrases).
  • The systematic analysis of the focus groups provided valuable insights into students’ disengagement with marking schemes. It also brought to light some issues that would need to be addressed before designing new rubrics.
  • The literature review provided research and critical perspectives on marking schemes as a tool of evaluation and a tool for learning. It suggested new ways of thinking about marking and rubrics and provided a scholarly basis for potential wider projects. The discussion it stimulated, however different the opinions, was an important starting point for the development of a shared discourse on assessment.

Reflections

The fuzziness of marking students’ complex performance cannot be overcome by simply linking numerical marks to qualitative standard descriptors. As noted in an HEA document, even the most detailed rubrics cannot capture all the aspects of ‘quality’ (HEA, 2012), and standards can be better communicated by discussing exemplars. There is also an issue with fixing the boundaries between grades on a linear scale (Sadler, 2013), and, as Race warns, the dialogue between learners and assessors (Race, HEA) can easily break down because of the evaluative terms typically used to pin down different standards of achievement. Despite all these pitfalls, in the current HE context, rubrics can benefit learning and teaching if they are constructed thoughtfully and involve all stakeholders.

By offering opportunities to discuss criteria and standards with students, rubrics can help to build a common understanding of how marks are assigned and so foster students’ assessment literacy, especially if their use is supported by relevant exemplars.

The belief that rubrics need to be standardised across modules, levels and years of study makes designing rubrics particularly difficult for ‘foreign languages’. Cultural changes require time and the involvement of all stakeholders, especially where the changes concern key issues that are difficult to address without a shared view on language, language learning and assessment. A thorough discussion of rubrics can provide chances to share ideas on marking, assessment and language development not only between students and staff but also within a team of assessors.

I have tried to engage students in the appraisal of rubrics and to avoid a market research approach to focus groups. It is clear that, if we are committed to making every assessment experience a learning experience and to avoiding the potential uneasiness that rubrics can cause students, we need to explore new ways of defining the standards of achievement in foreign languages. Establishing pedagogical partnerships with students seems a good way to start.

Follow up

I will encourage a differentiation of rubrics based on level of language proficiency and a collection of exemplars for other language modules. The natural follow up to this project would be to continue enhancing the rubrics used for evaluation and feedback in languages in the light of the analysis of the focus group discussions and the review of the literature on assessment, ideally with the collaboration of students. Possible connections between the marking schemes used to assess language modules and cultural modules will be explored.

References

HEA, 2012. A Marked Improvement. Transforming assessment in HE. York: Higher Education Academy.

Race, P. Using feedback to help students to learn [online] Available at https://www.heacademy.ac.uk/knowledge-hub/using-feedback-help-students-learn   [accessed on 15/8/2018]

Sadler, D. R. 2013. The futility of attempting to codify academic achievement standards. Higher Education 67 (3): 273-288.

 

Links to related posts

‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Working in partnership with our lecturers to redesign language marking schemes 

Involving students in the appraisal of rubrics for performance-based assessment in Foreign Languages By Dott. Rita Balestrini