Improved Neural Network assessment by staged laboratory practicals

Professor Richard Mitchell, School of Systems Engineering

Overview
Adjustments were made to teaching, assessment and feedback in Neural Networks (SE2NN11), a Part Two module within the School of Systems Engineering, using three staged laboratory practicals to encourage students to use neural networks on a ‘real world’ application. These changes increased the number of students successfully producing a neural network.

Objectives
  • Increase the number of students successfully producing a neural network.
  • Provide more detailed and prompter feedback to students.

Context
The major assessment for SE2NN11 requires students to write a program implementing a particular neural network and then to use that network on a ‘real world’ application. Students demonstrated their network by the end of the autumn term, at which point verbal feedback was given, and then applied it to a real world problem of their choice in the following term. Previously, students had difficulty with this first stage, so fewer moved on to the (more interesting) second stage, and only around 75% of students submitted a report.

Implementation
During the pilot year (2009), the tasks associated with writing the neural network were carefully divided into three parts, and three associated 90-minute laboratory sessions were organised, two weeks apart, for the work. The lecturer and two laboratory demonstrators were available to help the students at these sessions.
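The case study does not record how the programming work was divided into its three parts; purely as an illustration, a minimal multilayer perceptron might be staged as (1) the forward pass, (2) the backpropagation update, and (3) training on a small problem. Everything in the sketch below — sigmoid units, the XOR task, and all function names — is an assumption, not the module's actual program.

```python
# Illustrative only: a minimal multilayer perceptron split into three
# sub-tasks. The module's real program, language and network type are
# not described in the case study; sigmoid units and XOR are assumptions.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Stage 1: the forward pass.
# weights: one list per layer; each neuron is [bias, w1, w2, ...].
def forward(weights, inputs):
    activations = [list(inputs)]
    for layer in weights:
        prev = activations[-1]
        outs = [sigmoid(n[0] + sum(w * a for w, a in zip(n[1:], prev)))
                for n in layer]
        activations.append(outs)
    return activations  # activations per layer, input layer first

# Stage 2: one backpropagation update; returns the sum-squared error
# measured before the weights are changed.
def backprop(weights, inputs, targets, lr=0.5):
    acts = forward(weights, inputs)
    # Output-layer deltas.
    deltas = [[(t - o) * o * (1 - o) for o, t in zip(acts[-1], targets)]]
    # Hidden-layer deltas, working backwards through the layers.
    for L in range(len(weights) - 2, -1, -1):
        layer_deltas = []
        for j, o in enumerate(acts[L + 1]):
            err = sum(weights[L + 1][k][j + 1] * deltas[0][k]
                      for k in range(len(weights[L + 1])))
            layer_deltas.append(err * o * (1 - o))
        deltas.insert(0, layer_deltas)
    # Gradient-descent weight update.
    for L, layer in enumerate(weights):
        for j, neuron in enumerate(layer):
            neuron[0] += lr * deltas[L][j]
            for i, a in enumerate(acts[L]):
                neuron[i + 1] += lr * deltas[L][j] * a
    return sum((t - o) ** 2 for t, o in zip(targets, acts[-1]))

# Stage 3: train the finished network on a toy problem (here, XOR).
def train_xor(epochs=5000, seed=1):
    random.seed(seed)
    # 2 inputs -> 3 hidden neurons -> 1 output.
    weights = [[[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)],
               [[random.uniform(-1, 1) for _ in range(4)]]]
    data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
    for _ in range(epochs):
        for x, t in data:
            backprop(weights, x, t)
    return weights
```

Splitting the work along these seams means each lab session ends with something that can be run and marked on its own, which is the property the staged practicals rely on.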

For each session, a Microsoft Word template file was provided, and the students copied and pasted relevant program output or small parts of the program (functions) into the appropriate parts of the template. A simple marking scheme, worth 30 marks, was associated with each part: typically a student could get 0, 1 or 2 for a piece of code plus 0 or 1 for its comments, or 0 or 1 depending on whether the program output was correct. There was also space for comments to be written.

These files were then submitted to the lecturer, who circled the relevant mark for each part and added comments where relevant. Detailed feedback was thereby generated very easily and quickly. The aim was to give feedback within a week of each session, allowing the student a further week to make any necessary corrections to one part before starting the next. In fact, the first week’s work was marked within two days.

Impact
Each year since this scheme was introduced, around 95% of students have been able to produce a neural network, a significant increase on the roughly 75% submission rate seen previously.

Reflections
The impression from the pilot was that a greater proportion of students had a working neural network than in previous years, suggesting the scheme had been a success. It has therefore been used each year since, with some changes to the templates (and to the program, to help reduce plagiarism between years).

One problem is that the structure of the program is so tightly defined that there is little scope for variation in the code, so copying is difficult to detect. This is partly addressed by requiring the students to comment their code and to discuss the object-oriented aspects of their program in the final report. In addition, students were expected to carry out experiments in their own time to investigate the effects of changing specific parameters in the program, and the instructions for the final report were made clearer to try to ensure this happens.
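The case study does not say which parameters the students were asked to vary. As a hedged illustration only, such an experiment might sweep the learning rate while training a single sigmoid neuron on the AND function and record the final error; the function name, the choice of AND, and all the values below are invented for this sketch.

```python
# Illustrative only: the module's actual experiments are not described.
# Here the varied parameter is the learning rate, and the 'network' is a
# single sigmoid neuron trained on the AND function by the delta rule.
import math
import random

def final_error(lr, epochs=200, seed=0):
    """Train one sigmoid neuron on AND; return the final sum-squared error."""
    random.seed(seed)
    w = [random.uniform(-0.5, 0.5) for _ in range(3)]  # bias, w1, w2
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    for _ in range(epochs):
        for x, t in data:
            o = 1.0 / (1.0 + math.exp(-(w[0] + w[1] * x[0] + w[2] * x[1])))
            d = (t - o) * o * (1 - o)
            w[0] += lr * d
            w[1] += lr * d * x[0]
            w[2] += lr * d * x[1]
    return sum(
        (t - 1.0 / (1.0 + math.exp(-(w[0] + w[1] * x[0] + w[2] * x[1])))) ** 2
        for x, t in data)

if __name__ == "__main__":
    # Sweep one parameter, hold everything else fixed, tabulate the result.
    for lr in (0.01, 0.1, 0.5, 2.0):
        print(f"learning rate {lr}: final error {final_error(lr):.4f}")
```

An experiment of this shape gives each student a result that is genuinely their own to discuss in the final report, even when the underlying program structure is shared.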

One disadvantage of the approach is that it discourages independent thought more than is ideal. The much increased submission rate, however, is encouraging.

The important aspects of this scheme are the division of the project into suitable, easily marked sub-tasks, the extra support provided during the development of the program, and the inherent feedback between sessions.

Follow up

Following the pilot year, it was realised that some functions were more complex than others, so the marking scheme was changed so that the number of marks available for each function's code and comments reflected its complexity.

From 2015, with the move to online submission, students upload their document to Blackboard Learn, where the work is readily marked. Rather than circling the marks, the ‘insert text’ option is used to allow the marks to be entered. Comments on errors or suggestions for improvements are also easily added in an appropriate context.

The second part of the assessment requires the student to apply their neural network to a ‘real world’ problem of their choice, to see if the network can learn that problem. In effect, the students are researching whether a neural network is appropriate. Given that, rather than being asked to write a report on their work, they are now asked to present their research in the form of a four-page conference paper. This exercises a new skill, complementing the report-writing skills they use elsewhere. This innovation has also proved successful.
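The case study does not show what such an investigation looks like in practice. Purely as a sketch, a student's paper might hold back some of the data, train on the rest, and report whether the network generalises to the held-out set. The toy dataset, the single-neuron model, and every name below are invented for illustration; a real project would use the student's own network and chosen problem.

```python
# Illustrative only: checking whether a trained model generalises,
# using a held-out test set. The dataset is a toy stand-in: label a
# point (x, y) in the unit square by whether x + y > 1.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def make_data(n=200, seed=42):
    """Generate the toy 'real world' dataset."""
    random.seed(seed)
    data = []
    for _ in range(n):
        x, y = random.random(), random.random()
        data.append(([x, y], 1 if x + y > 1.0 else 0))
    return data

def train(data, lr=0.5, epochs=100):
    """Delta-rule training of a single sigmoid neuron."""
    w = [0.0, 0.0, 0.0]  # bias, w1, w2
    for _ in range(epochs):
        for x, t in data:
            o = sigmoid(w[0] + w[1] * x[0] + w[2] * x[1])
            d = (t - o) * o * (1 - o)
            w[0] += lr * d
            w[1] += lr * d * x[0]
            w[2] += lr * d * x[1]
    return w

def accuracy(w, data):
    """Fraction of examples classified correctly at a 0.5 threshold."""
    hits = sum(1 for x, t in data
               if (sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) > 0.5) == (t == 1))
    return hits / len(data)

if __name__ == "__main__":
    data = make_data()
    train_set, test_set = data[:150], data[150:]  # simple hold-out split
    w = train(train_set)
    print("held-out accuracy:", accuracy(w, test_set))
```

Reporting the held-out result rather than the training result is exactly the kind of evidence a short conference-style paper would be built around.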