Making Word and PowerPoint accessible: by Professor Richard Mitchell and Dr Laura Bennett

Preamble

Last year the University agreed a new Policy on Inclusive Practice in Teaching and Learning, which is available at: http://www.reading.ac.uk/web/files/qualitysupport/Policy_on_Inclusive_Practice_in_Teaching_and_Learn.pdf. The implementation of this policy is being overseen by a working group chaired by Clare Furneaux, and one of its four subgroups, on Staff Training, has been chaired by us both. One aspect of the policy is making documents and presentations inclusive, and the purpose of this blog post is to discuss our experiences of using Word and PowerPoint in the preparation and delivery of our teaching materials.

This blog post should be read in conjunction with the top tips on accessibility document, first sent round in the summer of 2017 and recently updated. More information is also available on the Engaging Everyone website, and in the various links here.

By following these tips, you make it easier for ALL to follow your documents and presentations, but they are especially useful for those who use screen readers, for whom a properly accessible document can be navigated much more easily.

To assess whether your document is accessible, in Word or PowerPoint go to the Info tab and, under Check for Issues, select Check Accessibility; suggestions for changes then come up. Note that you may need the file saved in the current format (.docx or .pptx rather than the older .doc or .ppt), otherwise you get the unhelpful message: “Unable to run the Accessibility Checker”.

From our experiences, and those of others, some of the suggestions made by the accessibility checker are not appropriate, so you should use your judgement – in the same way that you don’t rely solely on the similarity percentage in Turnitin when assessing plagiarism.

Unfortunately, we have found that making our files accessible is not as straightforward as one would like, hence this post. It covers specific issues in Word and PowerPoint, and then topics relevant to both.

Word

The key points as regards Word are to use appropriate fonts of a suitable size and to ensure suitable navigation. This is generally straightforward: you use the built-in styles, such as Title, Normal, Heading 1, etc. For each of these you define the appropriate font (a sans serif font such as Arial, Calibri or even Effra, the University’s corporate font), size (at least 12 point) and spacing (1.5 line spacing is recommended). Guidance on using styles is available here.
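For anyone who prefers to script such checks rather than eyeball each style, the rules above are easy to express in code. The sketch below is a minimal, hypothetical helper (the function name and font list are our own, not part of Word) that tests a proposed style definition against the guidance: sans serif font, at least 12 point, 1.5 line spacing.

```python
# Hypothetical checker for the style guidance above: a sans serif font,
# at least 12 point, and 1.5 line spacing.

SANS_SERIF_FONTS = {"Arial", "Calibri", "Effra"}  # extend with your own fonts

def check_style(font, size_pt, line_spacing):
    """Return a list of problems with a style definition; empty means it passes."""
    problems = []
    if font not in SANS_SERIF_FONTS:
        problems.append(f"'{font}' is not in the list of sans serif fonts")
    if size_pt < 12:
        problems.append(f"{size_pt} pt is below the 12 pt minimum")
    if line_spacing < 1.5:
        problems.append(f"line spacing {line_spacing} is below the recommended 1.5")
    return problems
```

For example, a Times New Roman style at 10 point with single spacing fails all three checks, while Arial at 12 point with 1.5 spacing passes.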

I, Richard, used to use such styles, but stopped doing so when I found that importing text from another Word document that uses different styles can ‘upset’ the formatting of the whole document. Now that I appreciate why styles are important, I am using them again. As a tip, to obviate this import ‘feature’ in Word, I have defined a template for my teaching material – you could consider having such an individual template, or perhaps a School or Departmental template.

Laura found that developing a template saved much time. One particularly frustrating feature of Word is its tendency to identify bullet points as headings, and the use of a template is certainly not a panacea, but it does help. Another tip is to ensure that the first few paragraphs of a document are correctly formatted and then to use Format Painter to make the rest of the document consistent. Double-clicking the paintbrush button for Format Painter allows you to copy that format onto multiple paragraphs; click the paintbrush again to cancel.

PowerPoint

Some of this is reasonably straightforward, but we both found that it can take a great deal of time.

It is recommended (especially for people with dyslexia) that the background colour is not white: ‘cream’ is suggested, though what that means is not usually defined! I, Richard, defined cream with RGB components 255, 240, 200, which looks fine on screen but seems white in some lecture theatres. Recently I discovered an example template where the RGB is 252, 230, 172 – quite close; another site suggests 255, 253, 208. To set the background, go to the Slide Master view, select the slide master, right-click on the screen, select Format Background and set the colour.
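Since the ‘right’ cream is debatable, one objective sanity check is the WCAG contrast-ratio formula, which can be computed directly from the RGB values discussed above. The sketch below is a pure-Python implementation of that standard formula (our own code, not a PowerPoint feature): black text on any of these creams comfortably exceeds the WCAG AAA threshold of 7:1.

```python
# WCAG 2.x relative luminance and contrast ratio, for checking text
# against candidate background colours such as the creams above.

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour given as (R, G, B) in 0-255."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

# Black text on cream (255, 240, 200) gives a ratio of roughly 18.5:1,
# well above the 7:1 required for WCAG AAA.
print(contrast_ratio((0, 0, 0), (255, 240, 200)))
```

The same function can be used to compare the three suggested creams, or to check coloured text on a coloured background, before committing to a slide master.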

A non-white background can be an issue with images if they themselves have a white background. PowerPoint allows the background of an image to be identified and set as transparent to solve this. However, as is typical for the product, this works only some of the time.

The slide masters can be used to set suitable fonts (again sans serif) and sizes (at least 24 point), as well as the background colour.

If a slide just has text in a text box, then with these styles little more is needed.

If, however, your slides have multiple objects, then more work is needed. For instance, the accessibility checker asks you to check the order in which the items are read – the order a screen reader uses.

To do this, go to the Home tab and select Arrange -> Selection Pane. You get a list of all the items on the slide and can adjust their order: select one and then use the up or down arrows.

We found, and this was not immediately obvious (or logical), that the items have to be listed in reverse order, so the Title is at the bottom and, we guess, any footer information comes last.

You are also warned when a presentation does not have (or at least the accessibility checker thinks it does not have) a title on each slide. It also warns about slides with duplicate titles. There may be good reasons for repeating a title, as a particular topic may be discussed across many slides. You can appease the checker with headings such as “Topic (1)”, “Topic (2)”, etc., but we doubt that this is helpful. You should use your judgement.

The checker expects each column of a table to have a header label. This may not be appropriate: for instance, Richard sometimes uses a table simply as a way of arranging elements in a rectangular grid. So this is an unhelpful warning – which, annoyingly, you can’t turn off.

Last year Richard attended an evening lecture in which the slides had many images that the speaker used as prompts to provide useful information. Most of the information in the lecture was in what the speaker said rather than in the slides. It was an engaging lecture, perhaps precisely because the speaker was not reading from the slides (whereas the accessibility advice is to say aloud what is on the slides). However, this is problematic from an accessibility point of view. One solution, which Richard has tried out, is to make use of the Notes section in PowerPoint – which screen readers can access. He is still evaluating this. Another, of course, is to have lecture capture …

Images, equations and hyperlinks

These apply to both Word and PowerPoint, so are covered together here: in each case it is important to add some more information. For images (and, in principle, equations) you add ‘Alt Text’ by right-clicking on the item and selecting the formatting option; a dialog then allows Alternative Text to be added, as a Title and/or a Description. For a hyperlink, you edit the link and add a ‘ScreenTip’.

We have found that the accessibility checker is happy if you put something there, but the text should really be meaningful. For suitable guidance, see for instance the University of Leicester’s advice on writing effective ALT text.
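If you want to spot-check alt text across many slides or documents, the ‘meaningful, not just present’ advice can be roughly automated. The sketch below is a hypothetical lint (the function name, messages and thresholds are our own) that flags alt text the checker would accept but a reader would not find helpful:

```python
# Hypothetical lint for alt text: the accessibility checker accepts
# anything non-empty, but empty, file-name-like or very short alt text
# is rarely meaningful to a screen-reader user.

import re

def alt_text_warnings(alt):
    """Return warnings for alt text that is technically present but unhelpful."""
    warnings = []
    stripped = alt.strip()
    if not stripped:
        warnings.append("alt text is empty")
    elif re.fullmatch(r".+\.(png|jpe?g|gif|bmp|tiff?)", stripped, re.IGNORECASE):
        warnings.append("alt text looks like a file name")
    elif len(stripped.split()) < 3:
        warnings.append("alt text is very short; consider a fuller description")
    return warnings
```

For instance, “IMG_1234.jpg” is flagged as a file name, while “A bar chart of attendance by week” passes cleanly.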

Laura has found that she makes constant use of the Alt Text function, described above, in describing the images in her PowerPoint slides.

If you have an image, say, you might want to add text (perhaps in a larger font than in the original image) to provide explanation: it then makes sense to create a group comprising the image, the text and perhaps some arrows. Annoyingly, the accessibility checker seems to want Alt Text for the whole group as well as for the image (and the arrows). Again, you should use your judgement.

Equations are themselves difficult, as the symbols used, layout, etc., are crucial, so a text description may not be easy or useful. Currently we are seeking guidance on these, so no more is said here.

Summary

Overall, we have found that whilst it is reasonably straightforward to make documents and presentations accessible, it does take time, so don’t leave it to the last minute. You do not, however, have to make them 100% accessible (as assessed by the built-in accessibility checker): use your judgement, and don’t be daunted into doing nothing at all if your first few accessibility checks give rise to rows of suggestions. It is important that we all produce inclusive documents and presentations as far as possible. Laura has also found that students are appreciative when she makes it clear that they should let her know if something is not working for them, so that she can fix it.

When producing new documents and presentations, it is much better to set them up to be as accessible as possible and then add the content. Both of us found that we quickly learned how to make materials accessible as we went along, thus saving time at the checking stage. It is also worth remembering that unless you change your materials completely every year, the amount of time you need to spend on this will decline dramatically after the first year.

 

Curriculum review in practice: Aligning to the Curriculum Framework – first steps. By Jeanne-Louise Moys, Rob Banham and James Lloyd

We’re all hearing about the University’s new Curriculum Framework in meetings and training. But how do we start to put this process of alignment into action for individual programmes? Three Typography & Graphic Communication (T&GC) colleagues decided to thrash out a clearer strategy for achieving this objective for our BA Graphic Communication programme.

Background

In T&GC, we’re currently working on ways to develop more sustainable assessment and feedback practices for increasing numbers of students. In autumn, Jeanne-Louise Moys met with Deb Heighes and Kamilah Jooganah from the University’s Centre for Quality Support and Development (CQSD) to discuss assessment strategies. Deb and Kamilah suggested looking at Programme-Level Assessment (see Hudson, 2010) as a first step to mapping out our various ideas and concerns and helping us evaluate what to keep, modify or discard. The programme-level rationale is a key priority for the Curriculum Framework: colleagues at Reading have, for example, explored its application through well-regarded projects like TESTA (http://testa.ac.uk/), introduced in Tansy Jessop’s keynote and workshop at the spring T&L Curriculum Framework Conference. We used Deb and Kamilah’s suggestion as an opportunity to dive into the Curriculum Framework and explore how our BA Graphic Communication degree might align to it.

Method and participants

In T&GC, in addition to module convenors, we have a Year Tutor for each year of study who helps ensure good practice and organisation across modules. Our three Year Tutors (James Lloyd – Part 1, Jeanne-Louise Moys – Part 2, and Rob Banham – Part 3, who is also our Department Director of Teaching and Learning) had a mini away day to workshop ideas for assessment and feedback. We spent just over half the day focusing on the activity presented here; the rest of the day involved discussing other aspects of assessment and feedback.

Our goal was to identify priorities at a programme level so that we can present our colleagues and our Board of Undergraduate Studies with a strategy for our response to the Curriculum Framework. We decided to brainstorm an initial strategy in a small group to ensure that when we ask all our Teaching & Learning staff to attend a Curriculum Framework session, we are able to make effective use of staff time. This is particularly important for a small department with intensive teaching schedules (due to the practical nature of many of our modules) and a high number of part time and sessional staff.

Programme mapping activities

We started our workshop by looking at our existing programme description outcomes, which we agreed were an important starting point. We rewrote these outcomes on large sheets of paper and pinned them up, so that we could review them as a team.

We identified the outcomes we felt were becoming outdated and should be omitted or modified. We also noted areas of our degree and teaching practices that we felt weren’t sufficiently addressed in the current programme description despite being recognised areas of good practice. We agreed that some of our T&L practices make a particular and distinct contribution to student learning and need to be recognised more explicitly in the programme description (rather than only in individual module descriptions). Examples include our real jobs scheme (http://typography.network/real-jobs-scheme/) and inclusive design activities (see the Breaking down Barriers project blog – http://typography.network/real-jobs-scheme/) that have become more developed within our curriculum.

Next, we looked at our Art and Design subject benchmark statement to ensure our discussion and review is appropriately aligned at a wider disciplinary level. We noted that the benchmark statement puts more emphasis on attributes such as creativity and students’ knowledge of ethics for professional practice than our existing descriptions. We used this to review our programme outcomes and identify outcomes that, for our discipline, need to be more explicit. Our discussion also included some critiques of the existing outcomes that we felt were too generic and don’t sufficiently highlight the typographic dimensions that make our degree distinct from other design programmes.

Having mapped out all the key discipline-specific content, we colour coded the four areas of the graduate attributes in the Curriculum Framework and began to apply the colour coding to our list of outcomes. This provided a useful way of restructuring outcomes and identifying repetition within the outcomes (where, for example, practical skills and transferable skills overlap in their remit). Using this new structure, we reviewed our outcomes and fine-tuned the wording for each one through collegial debate. Our lively and critical discussion ensured we had a set of outcomes that we felt were attuned appropriately to our programme and the graduate attributes in the Curriculum Framework.

Outcomes and next steps

This has provided us with a revised set of programme outcomes, which we will present to our colleagues for discussion before our Board of Studies reviews them more formally. Our intention is to ask module convenors to use these categories, which are now mapped to the graduate attributes, to review individual modules. This will give us clear information that we can evaluate to ensure that we are sufficiently addressing each graduate attribute of the Curriculum Framework across the degree, to track these through the three years of the programme, and to identify areas where we are over-teaching.

This activity helped us to identify which areas we need to focus on and address more explicitly, and which areas we are confident we are already aligning to well. In particular, we are aware that the Academic Principles “Diverse and inclusive” and “Global” are less effectively embedded in our undergraduate programme than they are in our postgraduate programmes and research. These are the areas of the Curriculum Framework that we will be prioritising and will be asking module convenors to consider in the most detail.

Our current Partnerships in Learning and Teaching (PLanT – http://www.reading.ac.uk/cqsd-PLanTProjectsScheme.aspx) project ‘I am, we are … different by design’ will also inform the ways in which, moving forward, we align individual modules and our teaching practices with the Curriculum Framework. Students working on this project are conducting research and other activities to help identify student-led recommendations about how we can nurture “Diversity and inclusion” and “Global” principles in the Department. They are also contributing to the development of a new module “Design for change” that will embed new opportunities for students to engage with a more diverse and global range of design practices within the BA curriculum.

Reflections on process

It was rewarding to have a teamwork day. We particularly enjoyed putting some of the brainstorming and information organisation processes we teach our students into action. Involving our Year Tutors means that we can begin looking at some of the details of our responses to the Curriculum Framework across the degree in a systematic way, rather than adopting well-intentioned but piecemeal approaches. Moving forward, this should help us achieve a good level of cohesion across the programme and avoid too much ‘module drift’.

References

Hudson, J. (2010). Programme-Level Assessment: a review of selected material. Published online: http://www.pass.brad.ac.uk/wp3litreview.pdf.

For those already involved in programme review, or about to embark on it, the ‘Curriculum Review in Practice’ event on Monday 30 April will be an opportunity for the Typography & Graphic Communication team, alongside other case studies from across the University, to showcase their journey through curriculum review and answer some of the more pertinent questions of what, how, and where to start.

This session is open to all staff and lunch will be provided. To book onto ‘Curriculum Review in Practice’ please click here.

Curriculum Framework Conference 2018

On 31 January the Meadow Suite in Park House was abuzz with an air of anticipation as attendees at the Curriculum Framework Conference munched on breakfast baguettes and gulped down copious quantities of fresh coffee and tea.

Following a generous welcome and the obligatory identification of emergency exits, the morning’s session commenced with a thoroughly engaging and thought-provoking keynote address from Professor Tansy Jessop of Southampton Solent University. Delegates were immersed in a lively discussion of the changing university environment and the impact this has on all aspects of teaching and learning. The key take-away seemed to be that transforming the assessment experience for students requires a programme-level approach, and is a critical component in the move from a knowledge-transmission model of higher education to one of social constructivism. Following an opportunity for collegial networking over a further cuppa, a choice of parallel sessions offered conference attendees a range of interactive workshops pertinent to the Curriculum Framework; this blog post will focus on the TESTA Masterclass workshop delivered by Tansy.

The Masterclass introduced participants to the three components of TESTA methodology: 1) Assessment Mapping, 2) a Student Assessment Experience Questionnaire, 3) Student Focus Groups.

The Psychology Department volunteered their level-one compulsory modules as an impromptu case study to illustrate the process of assessment mapping. This was seen as a brave move by at least one of the participants!

The resulting quantification of both the total number and the spread of assessment types demonstrated the value in undertaking this type of analysis. Participants also had a chance to review the current version of the Assessment Experience Questionnaire, and to review a transcript taken from a student focus group. Tansy skilfully showed the importance of using all three tools to gain a holistic understanding of the assessment environment as a precursor to full-scale review.

TESTA was born out of an HEA-funded project; resources, tools and case studies are available online here.

Parallel sessions in both the morning and afternoon also engaged participants in: Rapid Curriculum Design; Fostering Belonging in Culturally Diverse Cohorts; Inclusive Practice; Engaging Students in Curriculum Review; and embedding Research & Enquiry, Assessment Literacy, and Employability into the curriculum.

From the conference it was evident that curriculum review in light of the Curriculum Framework is gathering momentum. In response to participant feedback, the Curriculum Framework team is planning a follow-up session focussing on what curriculum review looks like in practice. This session will be designed to answer the more practical questions of what, how, and where do I start.

Watch this space!

How Pre-sessional English has developed the use of Turnitin for submission, marking and feedback to support students’ essay and exam writing

Jonathan Smith is the School Director for Technology Enhanced Learning in ISLI (International Study and Language Institute). He is also a PSE (Pre-sessional English) Course Director and teacher of English.

The Pre-sessional English programme accepts around 600 to 800 students each year. These students develop English skills in academic writing, reading, speaking and listening.

In the area of academic writing Jonathan Smith and his team have been exploring the use of Turnitin (Tii) GradeMark to facilitate electronic marking and feedback via:

- E-submission of written essays.

- E-marking and e-feedback via GradeMark, using QuickMarks and text comments.

- Student engagement with feedback in subsequent production of written work.

About five or six years ago, before the use of GradeMark was adopted at the University, a group of pre-sessional staff attended a conference in Southampton at which colleagues from other universities presented how they were using GradeMark. It seemed to be a tool that could not only save time in producing feedback but also produce feedback of more consistent quality. A couple of years later PSE started exploring its use with our cohorts of English academic writing students.

Listen to Jonathan’s experience on how he got involved with electronic submission, marking and feedback via Tii in this podcast.

Jonathan Smith provides all PSE teachers with a one-hour workshop on how to use Turnitin and GradeMark. Part of the training involves the use of the PSE ‘QuickMarks’ for e-feedback. These QuickMarks focus on common student errors, with explanations and links to relevant sources, and can be used to provide in-text feedback. ‘QuickMarks’ are based not only on common grammar and lexical errors but also on the complexity of the language structures used and on coherence and cohesion in the texts. Students are also assessed on content, use of references and other areas relevant to academic essay writing.

After the training session, tutors set up submission points for formative work; in this manner students grow accustomed to submitting work, accessing feedback, and seeing and comparing their own progress.

Students receive feedback almost immediately and they can work on the feedback either to bring it to the next class or towards their next assignments.

From the teachers’ perspective, it was quicker to note common student errors in-text using QuickMarks. It was also possible to see colleagues’ feedback comments, which helped new tutors become familiar with marking and feedback across the cohorts.

One of the big advantages is that Turnitin is a one-stop shop for both checking similarity and producing and receiving feedback. Students upload their essays, see their similarity reports, and have the opportunity to take action and re-submit. There are a few technical issues around doing that, but the pre-sessional programme is committed to students seeing their similarity reports and using them to get a better idea of the quality and acceptability of their work.

Visit the EMA programme site for more case studies and updates: http://www.reading.ac.uk/internal/ema/ema-news.aspx

The benefits of new marks availability on RISIS for personal tutoring: By Dr Madeline Davies, EMA Academic, and Kat Lee (External)

The EMA Programme has delivered a new function within RISIS that allows colleagues to see their students’ sub modular marks on the Tutor Card. We have all had access to the Tutor Card for some time and it has provided an invaluable snapshot of a student’s degree history, particularly useful for writing references and for monitoring attendance. However, in terms of sub modular marks, it has always functioned retrospectively: prior to the start of the new academic year, our students’ updated assessment records from the previous session are available on the Card but they have never been available during the academic session.

The sub modular mark screens accessible via the Tutor Card mean that we will no longer have to wait until the end of the academic year to have access to our students’ assessment information and this creates a range of benefits for personal tutors in particular. Easy access to the sub modular marks will provide an early indication of any problems that our students may be having and this will allow us to address these issues in a timely manner.

The information becoming available is significantly more extensive than a list of marks alone: a series of codes is used to flag up, for example, a result involving academic misconduct or extenuating circumstances requests (scroll down the page to translate the codes via the key), and a hover function under ‘Notes’ provides submission details so that personal tutors can tell when a ‘late’ penalty has been applied or when there has been another change to a mark (see image). Any one of these situations would require personal tutor intervention but, until now, this information has not been available to us unless our tutees have chosen to disclose it in personal tutor meetings.

The new screens are, then, particularly significant for our work as personal tutors: the wealth of information made available gives tutors the means to identify and support students who are struggling before they find themselves in crisis. Proactive and early intervention is always more effective than reactive response, and the additional access to information during the year that has been made available by EMA allows us to ensure that no student falls behind without us realising it.

The new screens also connect with the University’s inclusivity agenda in that students coming to us from non-traditional educational backgrounds can need extra support in their first months with us. The screens will alert us to situations where Study Advice, or Counselling and Wellbeing, need to be consulted.

In addition, students who may be of concern in academic engagement and/or Fitness to Study processes can be checked at every assessment point, giving Senior Tutors and SDTLs the opportunity to assess a student’s ability to cope with the pressure of assessment deadlines. This in turn facilitates early intervention in problematic cases and provides an easily available record of performance in cases requiring escalation.

The role of the personal tutor primarily involves offering tutees academic advice in response to their marks, feedback and more general concerns. The addition to the Tutor Card of sub modular marks and notes during the course of the year underpins this work and creates the opportunity for meaningful discussions with our tutees. New access to this information allows us to respond to student issues ‘in real time’, enabling personal tutors to act as effective academic advisors and to engage in crucial developmental dialogue with the students in our care.

To view a screencast that shows you how to navigate the sub modular mark screens on the tutor card, click https://www.screencast.com/t/sKCH4czjJ

To view a screencast that shows you how to navigate the Module Convenor Screens that are now also live, click http://www.screencast.com/t/MjCxE6UxfM

For further information on the EMA Programme, please click http://www.reading.ac.uk/ema/

Exploring different types of video cameras for use in practical classes and outreach By Dr Philippa Cranwell, Mrs Susan Mayes and Dr Jenny Eyley

A successful TLDF application in April provided us with funds to explore the use of different lapel-mounted cameras to look into student-student and student-staff interactions within a practical laboratory environment. This work is still ongoing, but we have learnt some interesting lessons about buying lapel-mounted cameras along the way, and have also used them successfully in outreach initiatives.

Cameras trialled

In total, four types of camera were trialled, costing between £39.95 and £119.90 (RRP; correct as of August 2017). For all the cameras we purchased additional memory cards, although some were supplied with small memory cards.

The first three were of a similar design: a camera shaped like a USB stick, with a clip on the back to allow it to be mounted on a pocket. The cameras trialled were: the Veho VCC-003-MUVI-BLK MUVI Micro Digital Camcorder (RRP £39.95); the Conbrov® Spy Cameras DV12 720P (RRP £59.99); and the Conbrov® WF92 1080P (RRP £69.99). All arrived quickly and were very easy to set up, although none had a screen, so it was not possible to see the recording without transferring the footage to a computer. We quickly realised that mounting these cameras on a lab-coat pocket was not satisfactory because they were quite weighty and fell forwards, resulting in a great deal of footage of the floor. A body harness was available for the Veho camera (RRP £39.95), which would have addressed this problem, but we decided not to continue with this style of camera because the lack of a screen meant there was no real-time feedback on recording quality.

L to R: Veho VCC-003-MUVI-BLK MUVI Micro Digital Camcorder; Conbrov® Spy Cameras DV12 720P; Conbrov® WF92 1080P

The camera that was most suitable for our needs was the Apeman Underwater Action Camera Wi-Fi 1080P 14MP Full HD Action Cam Sports Camera 2.0 (RRP £119.90). This camera came with two batteries, each recording up to 90 minutes of footage. We purchased micro SD cards separately; cards over 32GB are not supported by this camera. In addition to the camera we purchased a Togetherone Essential Accessories Bundle Kit (RRP £59.99), which provided a large number of additional items for mounting the camera as required. Some of the most useful items in the pack included a “selfie stick” that was used by school children on an outreach visit, a body harness and a head-mounted harness. The camera itself arrived in a plastic container, which is waterproof and protects the camera, but which muffles the sound when recording dialogue. However, there are alternative holders, so the camera can be mounted on the body or head in an open case, allowing clear dialogue to be captured.

The Apeman Underwater Action Camera Wi-Fi 1080P 14MP Full HD Action Cam Sports Camera 2.0 and the Togetherone Essential Accessories Bundle Kit

Use in outreach

The cameras were successfully used by secondary school students who took part in a trip to Thames Water sewage treatment works. This trip was organised by the chemistry outreach team as part of the Chemistry for All project, in order to show students how chemistry is used in all parts of their daily life. The number of students able to have this experience was limited by the space on the observation platforms, so the students used the cameras to film their experience and produce a video diary of the day. The videos will be edited and shared with other students back at school, widening the reach of the activity beyond the students who attended. The teacher who attended with the students commented that “having the Apeman cameras during the tour meant they were more excited and enjoyed it more”.

Photographs taken by the students at the Thames Water sewage treatment works

Outlook

The Apeman cameras have been a useful addition to the Department, particularly for outreach purposes. We will continue to use the cameras for outreach, and also to undertake some observations of students undertaking practical work for the TLDF-funded project and another internationalisation project in conjunction with ISLI.


Launching the FLAIR CPD scheme at the University of Reading Malaysia – By Dr Eileen Hyder

One of the highlights of 2017 for me was launching the FLAIR CPD scheme at the University of Reading Malaysia. A substantial part of my role involves talking to colleagues about their work to help them to develop ideas for their FLAIR CPD application. These conversations give me wonderful snapshots into the fantastic work happening across our institution. This is such a privilege and is probably what I love most about my work. I knew I would find it fascinating to talk to colleagues at UoRM and to learn more about the work they are doing in such a different context. However, the conversations I had there were not just fascinating but a real eye-opener for me.

One aspect of an application for Associate Fellowship or Fellowship is to write 600 words on designing and planning learning. Because the sessions/modules delivered in Malaysia have often been designed at Reading, this raised questions about whether colleagues at UoRM would be able to demonstrate this type of activity. However, the discussions that took place in the workshops generated many examples that quickly showed us that any concerns we had were misplaced.

One example that sticks in my mind came from a colleague in Psychology. He explained to us that some Psychology students at Reading will have studied the subject at school, and added that even those who haven’t will more than likely be aware of some key figures and concepts included in the university curriculum. However, because Psychology does not feature on the school curriculum in Malaysia and because awareness of figures like Freud or concepts like psychoanalysis cannot be taken for granted, he needs to reflect carefully on what has been designed at Reading UK to ensure it can be delivered effectively at UoRM.

Another colleague explained to us that modules at UoR UK are sometimes designed around the research interests of staff. In a case like this, the module might be taught by a team of as many as eight colleagues, with each person delivering a session built around their area of expertise. However, the same module will be delivered by only one tutor at UoRM. While I have had experience of delivering sessions designed by someone else, I have never been in a position like this. I know I would be conscious of the limits of my expertise compared to the experts at Reading UK and anxious about whether I would be able to provide an equally high-quality learning experience for my students. I felt huge respect for the way colleagues at UoRM take responsibility for designing sessions that do this.

Through these conversations and others we quickly came to realise that we had been naive in thinking it might be difficult for colleagues at UoRM to write about designing/planning learning. We realised that far from being passive deliverers of material designed at Reading UK, they work very hard to translate and customise learning for the UoRM context. This means exercising professional judgement and skills to make learning relevant and accessible to their students.

One of the things I love about my role is how it enriches my own understanding of teaching and learning. Working with colleagues at UoRM certainly broadened my understanding of what counts as designing/planning learning. The Curriculum Framework is leading to exciting discussions about how our curricula are designed. My experiences at UoRM have led me to think that we should involve as wide a range of colleagues as possible in these discussions. Just because someone might not have had autonomy in the original design of a module does not mean that they have no agency. The Curriculum Framework is an important catalyst for discussions around curriculum design and around the global relevance of our programmes/modules. Involving colleagues who take something designed in one context and deliver it in another could add richness and value to these discussions.

Forecasting, Feedback and Self-reflection by Dr Peter Inness

Overview:

Each year a group of part 2 students from Meteorology make their way across campus to the Minghella Building to film weather forecasts in the professional “green screen” studio. As well as improving their forecasting ability this module also helps students to improve their presentation skills – a key employability attribute in many careers.

Objectives:

During the module students will:

  • make short video weather forecasts in a professional studio
  • receive feedback on performance in order to improve the quality of their work
  • give peer feedback to fellow students in order to develop this useful life skill
  • reflect on their performance and consider how they can use the feedback to improve future performances.

Context:

Presentation skills are a crucial aspect of many jobs, whether it be in front of a camera or face to face with an audience. Lecturers in Meteorology may not always be the best people to coach these skills so we draw on experience in a School where performance and presentation is at the heart of everything they do.
Students spend four sessions in the TV studio, working up to the filming of a “live” TV weather forecast. After each rehearsal, students receive detailed feedback on their performance from staff and from their fellow students. Crucially, they are also asked to reflect on their own performance and how they might improve it. This self-reflection is something we would like to encourage across the Meteorology department, as it is a skill which perhaps doesn’t come as naturally to a scientific discipline as it does to a performance-related discipline such as film and theatre.

Impact:

Students are very appreciative of the high level of feedback on performance in this module, as evidenced in module evaluation questionnaires. The feedback also has a massive impact on improving the students’ performances across the module, resulting in some near professional standard performances by the end.

It is obvious that the encouragement to reflect and take on board feedback is a major driver of improved student performance in this module.

Reflections:

Working in an environment in which feedback and self-reflection are built into the activities has made me as a module convenor in a science department realise that this is something we can use more effectively across many of our other modules, not just those which involve presentation.

Self-reflection and peer feedback have a clear impact on performance in this module and we need to find ways to incorporate more of these activities into the rest of our taught modules.

I am now actively looking at ways that we can make reflection an integral part of how our students approach their learning.


Supporting Diversity through Targeted Skills Development: Helping Students to Speak a New Language by Alison Fenner SFHEA (Institution Wide Language Programme, ISLI)

Context

As the student population becomes increasingly international, the IWLP language class cohorts are becoming ever more diverse. It has become evident to tutors in IWLP (as throughout the University) that the linguistic, educational and cultural aspects of a student’s background can play an important role in their language acquisition, often helping some aspects while hindering others. In language learning, they may experience varying success in the development of the four language skills of listening, reading, speaking and writing, performing well in some skills while experiencing difficulty in others.

The Language Learning Advisor scheme and the development of a PLanT project

With this in mind, in the Autumn Term of 2016 I successfully applied for PLanT (Partnerships in Learning & Teaching) funding to provide targeted support sessions in oral work and pronunciation for those students who found these areas more challenging. The aim of the project was to improve their performance, motivation and, crucially, confidence. PLanT funding is awarded by CQSD and RUSU for projects involving both staff and students, and I invited three Language Learning Advisors (two undergraduates from the Department of Modern Languages and European Studies and one multi-lingual undergraduate from Psychology and Clinical Language Sciences) to deliver the sessions. Since these sessions had a particular focus, they were delivered on a small-group basis rather than the one-to-one basis more usual for Language Learning Advisors. They were delivered to students studying German at beginner level.

The three Language Learning Advisors were part of the peer-to-peer Language Learning Advisors scheme, which I have run since 2012. In the scheme, I train students who are successful language learners (usually languages undergraduates in the DMLES or students from the higher stages of IWLP) to advise their peers in DMLES and IWLP on the acquisition of effective language learning strategies, including the development of particular language skills and independent learning. The Advisors help students to develop effective self-evaluation, to reflect on their learning styles and to set achievable long-term and short-term goals in their language learning. Students also benefit from the support and encouragement offered by their Advisors in the continued dialogue of follow-up sessions in which progress is monitored.

Before the PLanT-funded sessions began, the Advisors and I discussed the needs involved and the strategies to use. I monitored the progress of the sessions, and at the end of the academic year the Advisors submitted records of activities completed and materials used, and reflections on their experience. Two Advisors worked with me on preparing a presentation for the LTRF (Learning and Teaching Research Forum) of the International Study and Language Institute in June; the third had already left the University by then but helpfully recorded her contribution on video. The presentation met with a positive response and was a valuable experience for the Advisors, enabling us to inform a wider audience about the PLanT project and about the Language Learning Advisor scheme in general. It also gave the Advisors the opportunity to present at a staff forum.

Project outcomes

This project was a very positive experience. I was able to harness the enthusiasm and creativity of the three Advisors to develop a new student-based initiative which, in at least one case, confirmed an Advisor’s choice of teaching as a career path. The students receiving the support benefited through increased fluency, improved pronunciation and greater confidence; this was clear from their feedback comments, which included: ‘The small-group oral session is helping me a lot, [X] is very kind and patient’, ‘The [tutor] is very friendly. There is an obvious improvement in my pronunciation.’

I intend to continue to run these small-group skills-based sessions in future years, since I believe that they address a clearly-perceived and increasing need. The experience gained this year, together with the Advisors’ reflections and information about materials and activities employed, will be of great value in achieving this end.

Involving students in the appraisal of rubrics for performance-based assessment in Foreign Languages By Dott. Rita Balestrini

Context

In 2016, in the Department of Modern Languages and European Studies (DMLES), it was decided that the marking schemes used to assess writing and speaking skills needed to be revised and standardised in order to ensure transparency and consistency of evaluation across different languages and levels. A number of colleagues teaching language modules had a preliminary meeting to discuss what changes had to be made, what criteria to include in the new rubrics and whether the new marking schemes would apply to all levels. While addressing these questions, I developed a project with the support of the Teaching and Learning Development Fund. The project, now in its final stage, aims to enhance the process of assessing writing and speaking skills across the languages taught in the department. It intends to make assessment more transparent, understandable and useful for students; foster their active participation in the process; and increase their uptake of feedback.

The first stage of the project involved:

  • a literature review on the use of standard-based assessment, assessment rubrics and exemplars in higher education;
  • the organisation of three focus groups, one for each year of study;
  • the development of a questionnaire, in collaboration with three students, based on the initial findings from the focus groups;
  • the collection of exemplars of written and oral work to be piloted for one Beginners language module.

I had a few opportunities to disseminate some key ideas that emerged from the literature review – at the School of Literature and Languages’ assessment and feedback away day, the CQSD showcase and the autumn meeting of the Language Teaching Community of Practice. Having only touched upon the focus groups at the CQSD showcase, I will describe here how they were organised, run and analysed, and will summarise some of the insights gained.

Organising and running the focus groups

Focus groups are a method of qualitative research that has become increasingly popular and is often used to inform policies and improve the provision of services. However, the data generated by a focus group are not generalisable to a population group as a whole (Barbour, 2007; Howitt, 2016).

After attending the People Development session on ‘Conducting Focus Groups’, I realised that the logistics of their organisation, the transcription of the discussions and the analysis of the data they generate require a considerable amount of time and detailed planning. Nonetheless, I decided to use them to gain insights into students’ perspectives on the assessment process and into their understanding of marking criteria.

The recruitment of participants was not a quick task. It involved sending several emails to students studying at least one language in the department and visiting classrooms to advertise the project. In the end, I managed to recruit twenty-two volunteers: eight for Part I, six for Part II and eight for Part III. I obtained their consent to record the discussions and use the data generated by the analysis. As a ‘thank you’ for participating, students received a £10 Amazon voucher.

Each focus group lasted one hour; the discussions were recorded in full and were based on the same topic guide and stimulus material. To open discussion, I used visual stimuli and asked the following question:

  • In your opinion, what is the aim of assessment?

In all three groups, this triggered some initial interaction directly with me. I then started picking up on differences between participants’ perspectives, asking for clarification and using their insights. Slowly, a relaxed and non-threatening atmosphere developed and led to more spontaneous and natural group conversation, which followed different dynamics in each group. I then began to draw on some core questions I had prepared to elicit students’ perspectives. During each session, I took notes on turn-taking and some relevant contextual clues.

I ended all three focus group sessions by asking participants to carry out a task in groups of three or four. I gave each group a copy of the marking criteria currently used in the department and one empty grid reproducing the structure of the marking schemes. I asked them the following question:

  • If you were given the chance to generate your own marking criteria, what aspects of writing/speaking/translating would you add or eliminate?

I then invited them to discuss their views and use the empty grid to write down the main ideas shared by the members of their group. The most desired criteria were effort, commitment, and participation.

Transcribing and analysing the focus groups’ discussions

Focus groups, as a qualitative method, are not tied to any specific analytical framework, but qualitative researchers warn us not to take the discourse data at face value (Barbour, 2007:21). Bearing this in mind, I transcribed the recorded discussions and chose discourse analysis as an analytical framework to identify the discursive patterns emerging from students’ spoken interactions.

The focus of the analysis was on ‘words’ and ‘ideas’ rather than on the process of interaction. I read and listened to the discussions many times and, as I identified recurrent themes, I started coding some excerpts. I then moved back and forth between the coding frame and the transcripts, adding or removing themes, renaming them, and reallocating excerpts to different ‘themes’.

Spoken discourse lends itself to multiple levels of analysis, but since my focus was on students’ perspectives on the assessment process and their understanding of marking criteria, I concentrated on those themes that seemed to offer more insights into these specific aspects. Relating one theme to the other helped me to shed new light on some familiar issues and to reflect on them in a new way.

Some insights into students’ perspectives

As language learners, students gain personal experience of the complexity of language and language learning, but the analysis suggests that they draw on the theme of complexity to articulate their unease with rubrics’ atomistic approach to evaluation and, at times, also to contest the descriptors of the standard for a first-level class. This made me reflect on whether the achievement of almost native-like abilities is really the standard on which we want to base our evaluation. Larsen-Freeman’s (2015) and Kramsch’s (2008) approach to language development as a ‘complex system’ helped me to shed light on the ideas of ‘complexity’ and ‘non-linear relations’ in the context of language learning which emerged from the analysis.

The second theme I identified is the ambiguity and vagueness of the standards for each criterion. Students draw on this theme not so much to communicate their lack of understanding of the marking scheme, but to question the reliability of a process of evaluation that matches performances to numerical values by using opaque descriptors.

The third theme that runs through the discussions is the tension between the promise of objectivity of the marking schemes and the fact that their use inevitably implies an element of subjectivity. There is also a tension between the desire for an objective counting of errors and the feeling that ‘errors’ need to be ‘weighted’ in relation to a specific learning context and an individual learning path. On one hand, there is the unpredictable and infinite variety of complex performances that cannot easily be broken down into parts in order to be evaluated objectively, on the other hand, there is the expectation that the sum of the parts, when adequately mapped to clear marking schemes, results in an objective mark.

Rubrics in general seem to be part of a double discourse. As an instructional tool they are described as unreliable, discouraging and disheartening. The feedback they provide is seen as lacking the effect on language development that the complex, personalised feedback teachers give can have. Effective and engaging feedback is always associated with the expert knowledge of a teacher, never with rubrics. However, the need for rubrics as a tool of evaluation is not questioned in itself.

The idea of using exemplars to pin down standards and make the process of evaluation more objective emerged from the Part III focus group discussion. Students considered the pros and cons of using exemplars, drawing on the same rationales that are debated in scholarly articles. Listening to, and reading systematically through, students’ discourses was quite revealing and brought to light some questionable views on language and language assessment that most marking schemes measuring achievement in foreign languages help to promote.

Conclusion

The insights into students’ perspectives gained from the analysis of the focus groups suggest that rubrics can easily create false expectations in students and foster an assessment ‘culture’ based on an idea of learning as a steady increase in skills. We need to ask ourselves how we could design marking schemes that communicate a more realistic view of language development. Could we create marking schemes that students do not find disheartening, and that help them understand how to progress? Rather than just evaluation tools, rubrics should be learning tools that describe different levels of performance and avoid evaluative language.

However, the issues of ‘transparency’ and ‘reliability’ cannot be solved by designing clearer, more detailed or student-friendly rubrics. These issues can only be addressed by sharing our expert knowledge of ‘criteria’ and ‘standards’ with students, which can be achieved through dialogue, practice, observation and imitation. Engaging students in marking exercises and involving them in the construction of marking schemes – for example by asking them how they would measure commonly desired criteria like effort and commitment – offers us a way forward.

References:

Barbour, R. 2007. Doing Focus Groups. London: Sage.

Howitt, D. 2016. Qualitative Research Methods in Psychology. Harlow: Pearson.

Kramsch, C. 2008. Ecological perspectives on foreign language education. Language Teaching 41 (3): 389-408.

Larsen-Freeman, D. 2015. Saying what we mean: Making a case for ‘language acquisition’ to become ‘language development’. Language Teaching 48 (4): 491-505.

Potter, J. and M. Wetherell. 1987. Discourse and Social Psychology: Beyond Attitudes and Behaviour. London: Sage.


Links to related posts

‘How did I do?’ Finding new ways to describe the standards of foreign language performance. A follow-up project on the redesign of two marking schemes (DLC)

Working in partnership with our lecturers to redesign language marking schemes 

Sharing the ‘secrets’: Involving students in the use (and design?) of marking schemes