Delivering innovative personalised feedback for large multi-station practical assessments

Team members: Helen Rodgers, David Hope, Helen Cameron

Abstract

This revised proposal aims to deliver improved feedback to over 500 students on Objective Structured Clinical Examinations (OSCEs). Improving feedback is a priority for the university and the medical school, and many other medical schools already deliver detailed feedback in multi-station practical assessments. We have developed, but not yet implemented, a method for providing students with written feedback comments on these ‘must-pass’ assessments; however, the effect of such methods is unknown. We must test the system, evaluate its impact and ensure it meets student needs. This project will implement and monitor examiner feedback training, evaluate a randomised sample of feedback sheets for quality and depth, explore student views through focus groups, questionnaires and class meetings, and examine any subsequent performance improvements derived from the new feedback. Given the nature of the project, once the two-year start-up phase is complete and the training materials are finalised, the project will be fully sustainable with no further investment. The project will, for the first time, provide personalised, effective feedback to the whole year group in a timely manner. The findings will be disseminated throughout the university and via publications.

Final Project Report

The final report may be downloaded using the link below:
Final Project Report (PDF)

Other Project Outcome

AMEE 2014 conference presentation (PDF)

This article was published on 2024-02-26