Obstructive Sleep Apnea Syndrome (OSAS) and Major Depressive Disorder (MDD) are common conditions associated with poor quality of life. In this work, we aim to classify OSAS, and depression in patients with OSAS, using machine learning techniques. We extracted features from electrocardiograms (ECG), electroencephalograms (EEG), and breathing signals from polysomnography (PSG) at specific 5-minute intervals where the participants' statuses are known, meaning breathing signals are not strictly required. These statuses include the sleep stage, whether or not the participant has depression, and whether an apneic event has occurred. The PSGs were recorded from a total of 118 subjects with a 75/25 split for training and testing, and the resulting features were used for sleep staging and for classifying OSAS and depression in OSAS patients. Sleep staging was performed best by a random forest without feature selection, yielding an accuracy of 70.52% and an F1-score of 69.99%. The best classification performance for OSAS was obtained during deep sleep with an SVM and no feature selection, yielding an accuracy of 98.36% and an F1-score of 98.82%. Across all sleep stages, chi-squared feature selection with an artificial neural network (ANN) yielded an accuracy of 72.95% and an F1-score of 73.43% for classifying depression in OSAS patients. The results show promise for detecting OSAS and depression in OSAS patients, and a Bland-Altman plot shows that the posterior probability provides a means of detecting OSAS comparable to the apnea-hypopnea index (AHI). Besides detecting OSAS in depressed patients, this work serves to classify depression and gives insights into the sleep stages relevant to both conditions, allowing better planning of polysomnography.

INDEX TERMS: Electroencephalography (EEG), electrocardiography (ECG), obstructive sleep apnea syndrome (OSAS), depression, sleep staging, machine learning.

OSAS is associated with metabolic diseases, mood disorders, reduced cognitive performance, increased risk of accidents, depression, memory loss, and cardiovascular complications such as arrhythmias, coronary heart disease, heart failure, and strokes, among other effects [1], [2], [3]. OSAS is a relatively common sleep-disordered breathing condition, affecting 3-7% of men and 2-5% of women in the general population [4], increasing to about 24% and 9%, respectively, when polysomnographic criteria alone are considered [5], [6]. Depression, or Major Depressive Disorder (MDD), is a common mental disorder characterized by reduced production of …
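The abstract above names the model choices at each step (random forest for sleep staging, SVM for OSAS, chi-squared feature selection with an ANN for depression, a 75/25 split). The following is a minimal sketch of that pipeline using scikit-learn; the feature matrices are random placeholders standing in for the ECG/EEG/breathing features per 5-minute epoch, and all hyperparameters are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of the classification pipeline summarized above (scikit-learn).
# X is a placeholder for the ECG/EEG/breathing features extracted per 5-minute PSG epoch.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
X = rng.random((1000, 40))           # placeholder features per 5-minute epoch
y_stage = rng.integers(0, 5, 1000)   # placeholder sleep-stage labels
y_osas = rng.integers(0, 2, 1000)    # placeholder OSAS labels
y_dep = rng.integers(0, 2, 1000)     # placeholder depression labels (OSAS patients)

# 75/25 train/test split, as in the study.
X_tr, X_te, s_tr, s_te, o_tr, o_te, d_tr, d_te = train_test_split(
    X, y_stage, y_osas, y_dep, test_size=0.25, random_state=0)

# Sleep staging: random forest without feature selection.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, s_tr)
print("staging accuracy:", accuracy_score(s_te, rf.predict(X_te)))

# OSAS classification: SVM without feature selection; posterior probabilities are
# enabled so they could later be compared against AHI (e.g., in a Bland-Altman plot).
svm = SVC(probability=True, random_state=0).fit(X_tr, o_tr)
print("OSAS F1-score:", f1_score(o_te, svm.predict(X_te)))

# Depression in OSAS patients: chi-squared feature selection followed by an ANN.
# chi2 requires non-negative inputs, hence the min-max scaling step.
ann = make_pipeline(MinMaxScaler(), SelectKBest(chi2, k=20),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0))
ann.fit(X_tr, d_tr)
print("depression accuracy:", accuracy_score(d_te, ann.predict(X_te)))
```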
Nonassociative learning is an important property of neural organization in both vertebrate and invertebrate species. In this paper we propose a neural model of nonassociative learning in a well-studied prototypical sensory-motor scheme: the landing reaction of flies. The general structure of the model consists of sensory processing stages, a sensory-motor gate network, and motor control circuits. The paper concentrates on the sensory-motor gate network, which has an agonist-antagonist structure. Sensory inputs to this circuit are transduced by chemical messenger systems whose dynamics include depletion and replenishment terms. The resulting circuit has a gated-dipole anatomy, and we show that it gives a good account of nonassociative learning in the landing reaction of the fly.
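As an illustration of the depletion/replenishment dynamics mentioned above, the sketch below simulates habituation of a signal gated by a depletable transmitter. The equation and all parameter values are generic (Grossberg-style gated-dipole transmitter dynamics), assumed for illustration only; they are not taken from this paper's circuit.

```python
# Illustrative habituation in one gated-dipole channel:
#   dz/dt = A * (B - z) - C * S(t) * z
# z  : available transmitter, replenished toward baseline B at rate A
# S  : sensory input, which depletes z at rate C while it is on
# The gated output S * z shrinks under repeated stimulation (nonassociative learning)
# and partially recovers during rest as z replenishes.
import numpy as np

A, B, C = 0.05, 1.0, 0.5      # illustrative replenishment rate, baseline, depletion rate
dt, steps = 0.01, 5000

z = B                         # transmitter starts fully replenished
gated_output = []
for t in range(steps):
    S = 1.0 if (t * dt) % 10.0 < 5.0 else 0.0   # repeated 5 s stimulus, 5 s rest
    dz = A * (B - z) - C * S * z                # replenishment minus signal-gated depletion
    z += dz * dt
    gated_output.append(S * z)                  # gated signal feeding the motor (landing) side

# Peak response to later stimulus presentations is smaller than to the first one.
print("first peak:", max(gated_output[:500]),
      "later peak:", max(gated_output[2000:2500]))
```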
The genuineness of smiles is of particular interest in the study of human emotions and social interactions. In this work, we develop an experimental protocol to elicit genuine and fake smile expressions in 28 healthy subjects. We then assess the type of smile expression using electroencephalogram (EEG) signals with convolutional neural networks (CNNs). Five different architectures (CNN1, CNN2, CNN3, CNN4, and CNN5) were examined to differentiate between fake and genuine smiles. We transform the temporal EEG signals into normalized gray-scale images and perform subject-dependent three-way classification of fake smiles, genuine smiles, and neutral expressions. We achieved the highest classification accuracy of 90.4% using CNN1 on the full EEG spectrum. Likewise, we achieved classification accuracies of 87.4%, 88.3%, 89.7%, and 90.0% using the Beta, Alpha, Theta, and Delta EEG bands, respectively. This paper suggests that CNN models, widely used in image classification problems, can provide an alternative approach to smile detection from physiological signals such as the EEG.
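To make the EEG-to-image step and the three-way CNN classification concrete, here is a minimal TensorFlow/Keras sketch. The data are random placeholders, and the architecture is a small generic CNN assumed for illustration; it is not one of the paper's CNN1-CNN5 models, and the channel/sample counts are assumptions.

```python
# Minimal sketch: normalize windowed EEG into gray-scale images, then train a small
# CNN for three-way classification (neutral / genuine smile / fake smile).
import numpy as np
import tensorflow as tf

def eeg_to_image(segment):
    """Min-max normalize a (channels x samples) EEG segment into [0, 1] gray-scale."""
    lo, hi = segment.min(), segment.max()
    return (segment - lo) / (hi - lo + 1e-8)

rng = np.random.default_rng(0)
raw = rng.standard_normal((300, 32, 128))                # 300 segments, 32 channels, 128 samples (assumed)
X = np.stack([eeg_to_image(s) for s in raw])[..., None]  # add the single gray-scale channel
y = rng.integers(0, 3, 300)                              # 0 = neutral, 1 = genuine, 2 = fake

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),      # three-way output
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.25, verbose=0)
```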