In this paper, we design a methodology to classify facial nerve function after head and neck surgery. It is important to be able to observe the rehabilitation process objectively after this type of surgery, which often leaves patients with facial palsy. The dataset used for the classification problem in this study contains only 236 measurements of 127 patients, graded with the widely used House–Brackmann (HB) scale, which relies on the subjective opinion of the physician. Although several traditional methods exist for evaluating facial paralysis, they ignore facial movement information, which plays an important role in the analysis of facial paralysis; this omission limits the selection of useful facial features for the evaluation. In this paper, we present a triple-path convolutional neural network (TPCNN) to evaluate mimetic muscle rehabilitation, observed by a Kinect stereovision camera. A system of three modules computes facial landmark measures and classifies facial paralysis with a parallel convolutional neural network structure, quantitatively assessing facial nerve paralysis from region-based facial features and the temporal variation of facial landmark sequences. The proposed deep network analyzes both the global and local facial movement features of a patient’s face. These extracted high-level representations are then fused for the final evaluation of facial paralysis. The experimental results verify that TPCNN outperforms state-of-the-art deep learning networks.
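The triple-path idea described above (parallel branches over global and regional landmark sequences, fused before classification) can be sketched as follows. This is a minimal NumPy illustration, not the paper's architecture: the kernels are random, the upper/lower face split is an arbitrary halving of the coordinates, and the six-way output merely mirrors the HB grades 1–6.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w):
    """Valid-mode 1D convolution of a (T, C) sequence with a (K, C, F) kernel, plus ReLU."""
    T, C = x.shape
    K, _, F = w.shape
    out = np.empty((T - K + 1, F))
    for t in range(T - K + 1):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0)

def path(x, w):
    """One branch: convolution + ReLU + global average pooling over time."""
    return conv1d_relu(x, w).mean(axis=0)

# Toy landmark sequence: 30 frames, 8 landmark coordinates per frame.
seq = rng.normal(size=(30, 8))

# Three branches with independent kernels; the region split is illustrative.
w_global = rng.normal(size=(5, 8, 16))
w_upper  = rng.normal(size=(5, 4, 16))
w_lower  = rng.normal(size=(5, 4, 16))

fused = np.concatenate([
    path(seq, w_global),         # whole-face branch
    path(seq[:, :4], w_upper),   # first half of coords as "upper face"
    path(seq[:, 4:], w_lower),   # second half as "lower face"
])                                # 48-dim fused representation

# Linear read-out over six grades, echoing the HB scale (1..6).
logits = fused @ rng.normal(size=(48, 6))
grade = int(np.argmax(logits)) + 1
print(fused.shape, grade)
```

The fusion step is the key design point: each branch pools over time independently, so the classifier sees a fixed-length vector regardless of sequence length.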
Gait and motion stability analysis for gait dysfunction is an active research area. Patients who undergo vestibular deafferentation typically experience changes in their dynamic balance. It is therefore important that both patients and physicians can monitor the progress of so-called vestibular compensation and observe the rehabilitation process objectively. Currently, the quantification of this progress depends heavily on the physician’s opinion. In this article, we design a novel methodology to classify the gait disorders associated with unilateral vestibular deafferentation in patients undergoing vestibular schwannoma surgery (a model of complete vestibular loss with imbalance due to vestibular nerve section and eventual labyrinthectomy). We present a dual-head attentional transformer-LSTM (DHAT-LSTM) to evaluate rehabilitation from gait dysfunction, observed by a Kinect camera. A Keypoint R-CNN detector computes body landmark measures, and the DHAT-LSTM network evaluates gait dysfunction from them, quantitatively assessing gait classification by tracking skeletal features through the temporal variation of feature sequences. The proposed deep network analyses the features of the patient’s movement, and these extracted high-level representations are fed to the final evaluation of gait dysfunction. The results demonstrate the effectiveness of the approach in classification when used in conjunction with state-of-the-art pose estimation and feature extraction techniques: an accuracy greater than 81% was achieved for the given sets of individuals using velocity-based, angle-based, and position features, for both the whole body and the symmetric features of the body.
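The feature families named above (velocity-based, angle-based, position, and left/right symmetry features computed from detected key points) can be sketched as below. This is an illustrative NumPy example, not the paper's pipeline: the joint indices follow a COCO-style ordering (11/12 hips, 13/14 knees, 15/16 ankles), which is an assumption, and the input is a random walk standing in for real Keypoint R-CNN output.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by points a-b-c."""
    v1, v2 = a - b, c - b
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def gait_features(kp, fps=30.0):
    """Per-frame features from a (T, J, 2) key-point sequence.

    Assumes COCO-style joints: 11/12 = hips, 13/14 = knees,
    15/16 = ankles (illustrative, not the paper's specification).
    """
    vel = np.diff(kp, axis=0) * fps                # velocity-based features
    speed = np.linalg.norm(vel, axis=-1)           # per-joint speed, (T-1, J)
    knee_l = np.array([joint_angle(f[11], f[13], f[15]) for f in kp])
    knee_r = np.array([joint_angle(f[12], f[14], f[16]) for f in kp])
    symmetry = np.abs(knee_l - knee_r)             # left/right asymmetry
    pos = kp - kp[:, 11:12]                        # positions relative to left hip
    return speed, knee_l, knee_r, symmetry, pos

# Synthetic trajectory: 60 frames, 17 joints, 2D coordinates.
T, J = 60, 17
kp = np.cumsum(np.random.default_rng(1).normal(size=(T, J, 2)), axis=0)
speed, knee_l, knee_r, sym, pos = gait_features(kp)
print(speed.shape, sym.shape, pos.shape)
```

Feature sequences of this shape would then be fed frame-by-frame to a recurrent classifier such as the LSTM head described in the abstract.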