March 14, 2019

Assessing the well-being of an animal is hindered by the lack of direct communication between humans and animals. Instead, a variety of behavioral, biochemical, physiological, and physical parameters are employed to evaluate an animal's well-being. Especially in the field of biomedical research, scientifically sound tools to assess pain, suffering, and distress in experimental animals are in high demand for ethical and legal reasons. For mice, the most commonly used laboratory animals, a valuable tool is the Mouse Grimace Scale (MGS), a coding system for facial expressions of pain in mice that has been shown to be accurate and reliable. Currently, MGS scoring is time-consuming and labor-intensive, as it is performed manually by specially trained human observers. We therefore aim to develop a fully automated system for the surveillance of well-being in mice.

Our work introduces a semi-automated pipeline as a first step towards this goal. We use and provide a new data set of images of black-furred laboratory mice that were moving freely; the images thus contain natural variation in perspective and background. The analysis of this data set is therefore more challenging, but it reflects the realistic conditions that would be obtainable without human intervention. Images were obtained after anesthesia (with isoflurane or a ketamine/xylazine combination) and surgery (castration). We deploy two pre-trained state-of-the-art deep convolutional neural network (CNN) architectures (ResNet50 and InceptionV3) and compare them to a third CNN architecture without pre-training. Depending on the particular treatment, we achieve an accuracy of up to 99% for binary "pain"/"no-pain" classification.
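The transfer-learning setup described above (a pre-trained CNN backbone with a new binary classification head) can be sketched as follows. This is a minimal illustration, not the authors' exact configuration: the input size, frozen backbone, pooling layer, and optimizer are assumptions. `weights=None` is used here only to avoid downloading weights; the pre-trained variant in the text would use `weights="imagenet"`.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Backbone: ResNet50 without its ImageNet classification head.
# NOTE: weights=None keeps this sketch self-contained; the
# pre-trained setup described in the text would pass
# weights="imagenet" (which downloads the ImageNet weights).
base = tf.keras.applications.ResNet50(
    weights=None, include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze backbone; train only the new head

# New head: a single sigmoid unit for binary "pain"/"no-pain" output.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```

Training would then proceed with `model.fit` on the labeled face images; the InceptionV3 variant is obtained by swapping the backbone for `tf.keras.applications.InceptionV3`.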
Author summary

In the field of animal research, it is crucial to assess the well-being of an animal. For mice, the most commonly used laboratory animals, there is a variety of indicators of well-being. Especially the facial expression of a mouse can give us important information on its state of well-being. Currently, however, the surveillance of well-being can only be ensured if a human is present. We therefore developed a first approach towards a fully automated surveillance of the well-being status of a mouse. We trained neural networks on face images of black-furred mice, which were either untreated or underwent anesthesia or surgery, to distinguish between an impaired and an unimpaired state of well-being. Our systems successfully learnt to assess whether the well-being of a mouse was impaired and, depending on the particular treatment, their decisions were correct in up to 99% of cases. A tool that visualizes the features used in the decision-making process indicated that the decision was mainly based on the facial expressions of a mouse.