Most doctors have never had their communication skills formally assessed and do not know how they compare with their peers. Glyn Elwyn and colleagues explain how AI might facilitate this and help improve interactions with patients.
Automatic recognition of chronic low-back pain and associated movement behaviour could be useful for health monitoring and for providing effective rehabilitation advice. Physical and muscle-activity information, combined with machine learning and feature engineering, can be used to automate this process. This paper presents a method for the automatic recognition of chronic pain and movement behaviour based on our recently proposed Active Data Representation (ADR) method, and applies it to two tasks of the EmoPain 2020 Challenge using physical and muscle-activity features. The ADR method transforms the physical and muscle-activity features for the classification tasks. Our results show that ADR outperforms the challenge's LSTM baseline model in hold-out validation, with a Matthews correlation coefficient of 0.43 for chronic-pain recognition and an F score of 61.21 for movement-behaviour recognition. Although performance decreases on the test dataset, ADR still outperforms the challenge baseline on both tasks.
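The ADR transformation is detailed in our earlier work rather than here. As a minimal, hedged sketch of the general idea (mapping variable-length sequences of frame-level features to fixed-length activation histograms and classifying those), one might write something like the following, substituting k-means for the self-organising map that ADR proper uses; all data shapes, names, and values are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.metrics import matthews_corrcoef, f1_score

# Hypothetical frame-level feature sequences (one array per recording),
# standing in for physical and muscle-activity descriptors.
rng = np.random.default_rng(0)
sequences = [rng.normal(size=(rng.integers(100, 200), 12)) for _ in range(40)]
labels = rng.integers(0, 2, size=40)  # 1 = chronic pain, 0 = healthy (illustrative)

# Fit a codebook on all frames (ADR proper uses a self-organising map here).
frames = np.vstack(sequences)
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(frames)

def to_fixed_length(seq):
    """Histogram of codebook activations: a fixed-length representation
    of a variable-length feature sequence."""
    hist = np.bincount(codebook.predict(seq), minlength=16)
    return hist / hist.sum()

X = np.array([to_fixed_length(s) for s in sequences])

# Hold-out split and a simple classifier on the transformed features.
X_train, X_val, y_train, y_val = X[:30], X[30:], labels[:30], labels[30:]
clf = SVC().fit(X_train, y_train)
pred = clf.predict(X_val)
print("MCC:", matthews_corrcoef(y_val, pred))
print("F1 :", f1_score(y_val, pred, average="macro"))
```

The fixed-length histogram is what makes recordings of different durations comparable to a standard classifier; the metrics printed correspond to the two evaluation measures reported above.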
We present a tool for the visualization of transcripts of multi-party dialogues, with application to the analysis of communication in medical teamwork. The visualization is based on a "temporal mosaic" metaphor, which provides a temporal overview of a dialogue and supports transcript browsing and information access by segmenting the dialogue and laying out the keywords of each segment on interactive visual "tiles". The tool has been tested on a corpus of transcribed dialogues among the members of a (simulated) critical care team. An analytical evaluation demonstrates the tool's potential uses in an educational setting and highlights areas for improvement.
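As a minimal sketch of the temporal-mosaic idea, one could render each dialogue segment as a tile on a time axis with its keywords overlaid; the sketch below assumes matplotlib, and the segment boundaries, bands, and keywords are invented for illustration, not drawn from the corpus:

```python
import matplotlib.pyplot as plt

# Hypothetical segmented transcript: (start_s, end_s, band, keywords).
segments = [
    (0, 45, 0, "airway\nintubation"),
    (45, 110, 1, "blood pressure\nfluids"),
    (110, 180, 0, "handover\nsummary"),
]

fig, ax = plt.subplots(figsize=(8, 2))
colors = ["#cce5ff", "#ffe0cc"]
for start, end, band, words in segments:
    # One "tile" per segment, placed along the time axis; in a fuller
    # implementation the tiles would be interactive and linked to the transcript.
    ax.barh(band, end - start, left=start, height=0.8,
            color=colors[band % len(colors)], edgecolor="grey")
    ax.text((start + end) / 2, band, words, ha="center", va="center", fontsize=8)
ax.set_xlabel("time (s)")
ax.set_yticks([])
plt.tight_layout()
plt.show()
```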
Research in automatic emotion recognition has seldom addressed computational resource utilisation. With the advent of ambient technology, which employs a variety of low-power, resource-constrained devices, this issue is gaining increasing interest. This is especially the case for health and elderly-care technologies, where interventions aim to maintain the user's independence as unobtrusively as possible. In this context, efforts are being made to model human social signals such as affect using low-cost technologies that can aid health monitoring. This paper presents an Active Feature Selection (AFS) method based on self-organizing map neural networks for emotion recognition in the wild. AFS is used to select feature subsets from three different feature sets: 62 of 88 features were selected for eGeMAPS, 21 of 988 for emobase, and 140 of 2832 for LBP-TOP features. The results show that the feature subsets selected by AFS yield better results than the full feature sets and than PCA dimensionality reduction. The largest improvement is observed on the emobase features, followed by eGeMAPS. For the visual features, nearly the same results are obtained with a significant reduction in dimensionality (only 5% of the full feature set is required for the same level of accuracy). Weighted score fusion yields a further improvement, leading to accuracies of 43.40% and 40.12% on the EmotiW 2018 validation and test datasets, respectively.
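Weighted score fusion is a simple form of late fusion: per-modality class scores are combined as a weighted sum before taking the argmax. A minimal sketch, assuming per-modality score matrices and illustrative weights (in practice the weights would be tuned on the validation set):

```python
import numpy as np

# Hypothetical per-class scores from audio and visual classifiers
# (rows = samples, columns = emotion classes); values are illustrative.
audio_scores = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
visual_scores = np.array([[0.4, 0.4, 0.2], [0.1, 0.3, 0.6]])

def weighted_fusion(score_list, weights):
    """Late fusion: normalised weighted sum of per-modality class scores."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(w * s for w, s in zip(weights, score_list))

fused = weighted_fusion([audio_scores, visual_scores], weights=[0.6, 0.4])
print(fused.argmax(axis=1))  # fused class predictions
```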