Daniela Standen, School Director of Teaching and Learning, ISLI Alison Nicholson, Honorary Fellow, UoR
In summer 2018-19, Italian and French on the Institution-wide Language Programme (IWLP) piloted paired oral exams. The impact of the change is explored below. Although discussed in the context of language assessment, the drivers for change, challenges and outcomes are relevant to any discipline intending to introduce more authentic and collaborative tasks into its assessment mix. Group assessments constitute around 4% of the University's assessment types (EMA data, academic year 2019-20).
The aims of the pilot were to:
- improve constructive alignment between the learning outcomes, the teaching methodology and the assessment process
- help students feel more relaxed and produce more authentic and spontaneous language
- make the assessment process more efficient, reducing teacher workload
IWLP provides credit-bearing language learning opportunities for students across the University. Around 1,000 students learn a language with IWLP at Reading.
The learning outcomes of the modules centre on the ability to communicate in the language. The teaching methodology favours student–student interaction and collaboration: in class, students work mostly in pairs or small groups. The exam format, on the other hand, was structured so that a student would interact with the teacher.
The exam was often the first time a student had spoken one-to-one with the teacher. This change in interaction pattern could be intimidating and tended to produce stilted question-and-answer sessions or interrogations rather than genuine communication.
Who was affected by the change?
Four proficiency levels
- The interlocution pattern changed from teacher-student to student-student, reflecting the normal pattern of in-class interaction
- The marking criteria changed, so that the quality of interaction was better defined and carried more weight
- The marking process changed: teachers as well as students were paired. Instead of the examiner re-listening to all the oral exams in order to award a mark, the exams were double-staffed. One teacher concentrated on running the exam and marking against holistic criteria, while the second teacher listened and marked using analytic rating scales
- Students would be more relaxed and produce more authentic and spontaneous language
- Student-to-student interaction creates a more relaxed atmosphere
- Students take longer speaking turns
- Students use more features of interaction
(Hardi Prasetyo, 2018)
- Perceived issues of validity and fairness around ‘interlocutor effects’, i.e. how the competence of the person a student is speaking to affects their outcome (Galaczi & French, 2011)
These risks were mitigated through:
- Homogeneous pairings, informed by class observation
- Include monologic and dialogic assessment tasks
- Planned teacher intervention
- Inclusion of communicative and linguistic marking criteria
- Pairing teachers as well as students, for more robust moderation
Methods of evaluation
Questionnaires were sent to 32 students who had experienced the previous exam format, to enable comparison. The response rate was 30%, with 70% of responses coming from students of Italian. Responses were consistent across the two languages.
Eight teachers provided verbal or written feedback.
Students’ Questionnaire Results
Overall, students’ feedback was positive. Students recognised closer alignment between teaching and assessment, and felt that talking to another student was more natural. They also reported increased opportunities to practise and felt well prepared.
However, they did not feel that the new format improved their opportunity to demonstrate their learning, nor did they find speaking to another student more relaxing. The qualitative feedback suggests that this was due to anxieties around pairings.
Teachers’ Feedback
- Language production was more spontaneous and authentic: one teacher commented that ‘it was a much more authentic situation and students really helped each other to communicate’
- Marking changed from a focus on listening for errors towards rewarding successful communication
- Workload decreased by up to 30% for an average cohort, and peaks and troughs of work were better distributed
Overall, the impact on both teachers and students was positive. Students reported that they were well briefed and had greater opportunities to practise before the exam. Teachers reported a positive impact on workloads and on students’ ability to demonstrate that they could communicate in the language.
However, this was not reflected in the students’ feedback: there is a clear discrepancy between teachers’ and students’ perceptions of how well the new format allows students to showcase their learning.
Despite mitigating action being taken, students also reported anxiety around the ‘interlocutor effect’. Race (2014) tells us that even when universities have put all possible measures in place to make assessment fair, they often fail to communicate this appropriately to students. The next steps should therefore focus on engaging students to bridge this perception gap.
Follow up was planned for the 2019-20 academic cycle but could not take place due to the COVID-19 pandemic.
Galaczi & French (2011). In Taylor, L. (ed.), Examining Speaking: Research and practice in assessing second language speaking. Cambridge: Cambridge University Press.
Fulcher, G. (2003). Testing Second Language Speaking. Edinburgh: Pearson.
Hardi Prasetyo, A. (2018). Paired Oral Tests: A literature review. LLT Journal: A Journal on Language and Language Teaching, 21(Suppl.), 105-110.
Race, P. (2014). Making Learning Happen (3rd ed.). London: Sage.
Race, P. (2015). The Lecturer’s Toolkit: A Practical Guide to Assessment, Learning and Teaching (4th ed.). London: Routledge.