As a common mental disorder, depression has attracted many researchers from the affective computing field to the task of estimating depression severity. However, existing approaches based on Deep Learning (DL) mainly focus on single facial images, without considering sequence information, when predicting the depression scale. In this paper, an integrated framework, termed DepNet, is proposed for the automatic diagnosis of depression from facial image sequences drawn from videos. Specifically, several pretrained models are adopted to represent low-level features, and a Feature Aggregation Module is proposed to capture high-level characteristic information for depression analysis. More importantly, the discriminative characteristics of depression on faces can be mined to assist the diagnosis.
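The abstract leaves the module design open; as a minimal sketch of the pipeline it describes, the code below extracts per-frame features with a pretrained backbone and pools them over the time axis with a learned attention weighting before regressing a scalar depression score. The ResNet-50 backbone, the attention-pooling aggregator, and the regression head are all assumptions for illustration, not DepNet's published architecture.

```python
import torch
import torch.nn as nn
from torchvision import models

class FrameFeatureExtractor(nn.Module):
    """Low-level per-frame features from a pretrained backbone (ResNet-50 assumed)."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop classifier

    def forward(self, frames):                 # frames: (batch, time, 3, H, W)
        b, t, c, h, w = frames.shape
        x = self.features(frames.reshape(b * t, c, h, w))  # (b*t, 2048, 1, 1)
        return x.view(b, t, -1)                            # (b, t, 2048)

class FeatureAggregator(nn.Module):
    """Hypothetical aggregation module: attention-weighted pooling over frames."""
    def __init__(self, dim=2048):
        super().__init__()
        self.attn = nn.Linear(dim, 1)
        self.regressor = nn.Linear(dim, 1)     # scalar depression score (e.g., BDI-II)

    def forward(self, seq):                    # seq: (b, t, dim)
        weights = torch.softmax(self.attn(seq), dim=1)   # (b, t, 1), sums to 1 over time
        pooled = (weights * seq).sum(dim=1)              # (b, dim)
        return self.regressor(pooled).squeeze(-1)        # (b,)
```

Attention pooling is one plausible way to exploit the sequence information the abstract emphasizes; simpler mean pooling over frames would discard which frames carry the discriminative facial cues.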
As a mental disorder, depression affects people's daily lives and work. Researchers have proposed various industrial intelligent systems in the pattern recognition field for audiovisual depression detection. This paper presents an end-to-end trainable intelligent system that generates high-level representations over an entire video clip. Specifically, a three-dimensional (3D) convolutional neural network equipped with a spatiotemporal feature aggregation module (STFAM) is trained from scratch on Audio/Visual Emotion Challenge (AVEC) 2013 and AVEC 2014 data, which can model the discriminative patterns closely related to depression. In the STFAM, channel and spatial attention mechanisms and an aggregation method, namely 3D DEP-NetVLAD, are integrated to learn compact characteristics from the feature maps. Extensive experiments on the two databases (i.e., AVEC2013 and AVEC2014) illustrate that the proposed intelligent system can efficiently model the underlying depression patterns and obtains better performance than most video-based depression recognition approaches. Case studies are presented to demonstrate the applicability of the proposed intelligent system for industrial intelligence.
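To make the aggregation step concrete, here is a sketch of a NetVLAD-style layer over 3D feature maps, following the standard NetVLAD formulation: each spatiotemporal location is soft-assigned to learned cluster centroids, and the assignment-weighted residuals are accumulated into a compact descriptor. The cluster count, feature dimension, and the exact relation to the paper's 3D DEP-NetVLAD are assumptions; the published module may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetVLAD3D(nn.Module):
    """Sketch of NetVLAD-style aggregation over a 3D (T, H, W) feature volume.

    Follows the standard NetVLAD idea (soft assignment + residual accumulation)
    applied to the flattened spatiotemporal locations of a C-channel volume;
    not a verified reimplementation of the paper's 3D DEP-NetVLAD.
    """
    def __init__(self, num_clusters=16, dim=512):
        super().__init__()
        self.num_clusters = num_clusters
        self.assign = nn.Conv3d(dim, num_clusters, kernel_size=1)   # soft assignment
        self.centroids = nn.Parameter(torch.randn(num_clusters, dim))

    def forward(self, x):                      # x: (B, C, T, H, W)
        b, c = x.shape[:2]
        soft = F.softmax(self.assign(x), dim=1).view(b, self.num_clusters, -1)  # (B, K, N)
        feats = x.view(b, c, -1)                                                # (B, C, N)
        # V[k, c] = sum_n a_k(n) * (x_n[c] - centroid_k[c])
        vlad = torch.einsum('bkn,bcn->bkc', soft, feats) \
             - soft.sum(-1).unsqueeze(-1) * self.centroids.unsqueeze(0)         # (B, K, C)
        vlad = F.normalize(vlad, dim=2)                    # intra-normalization per cluster
        return F.normalize(vlad.flatten(1), dim=1)         # (B, K*C) compact descriptor
```

The fixed-length K*C output is what makes the representation "compact": it no longer depends on the clip's spatiotemporal extent, so a single regression head can score clips of varying length.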
Background: Noninvasive preoperative prediction of histological grade is essential for the clinical management of cerebral glioma. Purpose: This study investigated the association between image quality assessment of 1H magnetic resonance spectroscopy (1H-MRS) and accurate grading of glioma. Materials and Methods: Ninety-eight glioma patients with pathological confirmation were retrospectively recruited in this single-center study. All patients underwent 1H-MRS examination at 3.0 T before surgery. According to WHO standards, all cases were divided into two groups: low-grade glioma (grades I and II, 48 cases) and high-grade glioma (grades III and IV, 50 cases). The metabolite ratios in both grades were calculated before and after image quality assessment. The area under the receiver operating characteristic (ROC) curve was used to evaluate the capacity of each ratio for glioma grading. Results: The Cho/Cr, Cho/NAA, and NAA/Cr metabolite ratios differed in each glioma group before and after MRS image quality assessment. In the low-grade glioma group, the Cho/Cr ratio differed significantly before and after image quality assessment (p = 0.011). After MRS image quality assessment, the accuracy of glioma grading was significantly improved. The Cho/Cr ratio, with 83.3% sensitivity and 93.7% specificity, was the best index for glioma grading, with an optimal cutoff value of 3.72. Conclusion: The image quality of MRS affects the metabolite ratios and the results of glioma grading. MRS image quality assessment can markedly improve the accuracy of glioma grading. The Cho/Cr ratio has the best diagnostic performance in differentiating high-grade from low-grade glioma.
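For readers who want to reproduce this style of ROC analysis, the sketch below computes the AUC for a metabolite ratio and selects a cutoff. The arrays are illustrative placeholders, not the study's data, and Youden's index is assumed as the cutoff criterion; the abstract does not state which criterion produced the 3.72 cutoff.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# cho_cr: Cho/Cr ratio per patient; label: 1 = high-grade, 0 = low-grade glioma
# (illustrative values only; the study's measurements are not reproduced here)
cho_cr = np.array([1.8, 2.1, 4.5, 3.9, 2.6, 5.2, 1.5, 4.1])
label  = np.array([0,   0,   1,   1,   0,   1,   0,   1  ])

auc = roc_auc_score(label, cho_cr)
fpr, tpr, thresholds = roc_curve(label, cho_cr)

# Youden's J = sensitivity + specificity - 1; maximize it to pick the cutoff
j = tpr - fpr
best = np.argmax(j)
print(f"AUC = {auc:.3f}")
print(f"optimal cutoff = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")
```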