This version is available at https://strathprints.strath.ac.uk/54353/

Abstract

Biofeedback from physical rehabilitation exercises has been shown to lead to faster recovery, better outcomes, and increased patient motivation. In addition, it allows the physical rehabilitation processes carried out at the clinic to be complemented with exercises performed at home. However, currently existing approaches rely mostly on audio and visual reinforcement cues, usually presented to the user on a computer screen or a mobile phone interface. Some users, such as elderly people, can have difficulty using and understanding these interfaces, leading to non-compliance with the rehabilitation exercises.
To overcome this barrier, the latest biosignal technologies can be used to enhance the efficacy of the biofeedback while decreasing the complexity of the user interface. In this paper we propose and validate a context-aware framework for the use of animatronic biofeedback, as a way of potentially increasing the compliance of elderly users with physical rehabilitation exercises performed at home. In the scope of our work, animatronic biofeedback entails the use of pre-programmed actions on a robot that are triggered in response to certain changes detected in the users' biomechanical or electrophysiological signals. We use electromyographic and accelerometer signals, collected in real time, to monitor the performance of the user while executing the exercises, and a mobile robot to provide animatronic reinforcement cues associated with their correct or incorrect execution. A context-aware application running on a smartphone aggregates the sensor data and controls the animatronic feedback. The acceptability of the animatronic biofeedback has been tested on a set of volunteer elderly users. The results suggest that the participants found the animatronic feedback engaging and of added value.
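To illustrate the kind of trigger logic such a framework involves, the sketch below maps features of the sensed signals to pre-programmed robot actions. All names, thresholds, and cue labels here are hypothetical assumptions for illustration, not the paper's actual implementation: a window of EMG samples is summarized by its RMS amplitude, and together with the accelerometer magnitude it selects a reinforcement cue.

```python
def rms(samples):
    """Root-mean-square amplitude of a signal window (e.g. an EMG burst)."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def select_feedback(emg_window, accel_magnitude,
                    emg_threshold=0.3, accel_range=(0.8, 1.5)):
    """Pick a pre-programmed robot action for the current sensing window.

    Hypothetical rule: a repetition counts as correct when muscle
    activation (EMG RMS) exceeds a calibration threshold AND the
    movement's acceleration magnitude stays inside the expected range.
    """
    activated = rms(emg_window) >= emg_threshold
    in_range = accel_range[0] <= accel_magnitude <= accel_range[1]
    if activated and in_range:
        return "positive_cue"          # e.g. robot performs a celebratory motion
    if activated:
        return "correct_posture_cue"   # effort present but movement out of range
    return "encourage_cue"             # insufficient muscle activation
```

For example, `select_feedback([0.4, 0.5, 0.45], 1.0)` returns `"positive_cue"`, while the same EMG window with an out-of-range acceleration of 2.0 returns `"correct_posture_cue"`. In the paper's architecture this decision would run inside the context-aware smartphone application, which aggregates the sensor streams and commands the mobile robot.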