2022
DOI: 10.1007/s11042-022-12372-7

Deep learning for sleep stages classification: modified rectified linear unit activation function and modified orthogonal weight initialisation

Abstract: Background and Aim: Each stage of sleep can affect human health, and insufficient sleep at any stage may lead to sleep disorders such as parasomnia, apnea, and insomnia. Sleep-related diseases can be diagnosed using a Convolutional Neural Network classifier. However, this classifier has not been successfully implemented in sleep stage classification systems due to high complexity and low classification accuracy. The aim of this research is to increase the accuracy and reduce the learning time of Convoluti…


Cited by 6 publications (3 citation statements); references 15 publications.
“…The final classification is performed by a conditional random field module that provides a conditional probability for the possible sleep stages. An orthogonal CNN (OCNN) architecture was modified and its performance investigated in [25], where the authors included an alternate version of a rectified linear unit (ReLU) and the Adam optimizer in the OCNN model. The authors claimed that this modification enhances feature extraction and accuracy, and it also reduces the learning time.…”
Section: Related Work
confidence: 99%
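The statement above mentions an orthogonal CNN with an alternate ReLU and the Adam optimizer, but the report does not specify the exact modifications. A minimal sketch of the two standard building blocks being varied — orthogonal weight initialisation via QR decomposition, and a leaky-style ReLU stand-in — under the assumption that the paper's variants resemble these (the names `orthogonal_init` and `modified_relu` are illustrative, not from the paper):

```python
import numpy as np

def orthogonal_init(rows, cols, seed=0):
    """Sample a Gaussian matrix and orthogonalise it via QR.

    Sketch of standard orthogonal initialisation; the paper's
    'modified' variant is not detailed in this report.
    """
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((rows, cols))
    q, r = np.linalg.qr(a)
    # Flip column signs so the factorisation is deterministic;
    # sign flips preserve orthogonality.
    q = q * np.sign(np.diag(r))
    return q

def modified_relu(x, alpha=0.01):
    """A leaky-ReLU-style stand-in: positives pass through unchanged,
    negatives are scaled by a small slope rather than zeroed.
    The exact alternate ReLU used in [25] is not given here."""
    return np.where(x > 0, x, alpha * x)
```

For a square matrix, `orthogonal_init` returns `W` with `W @ W.T` equal to the identity, which keeps gradient norms stable early in training — the usual motivation for orthogonal initialisation.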
“…[58]. Normalization layer: maintains regularity and avoids overfitting, while simultaneously speeding up computation by the CNN [59]. Rectified Linear Unit (ReLU): eliminates all negative values and substitutes them with zero [60]. Pooling layer: retrieves values from segments of images bounded by kernels [61]. Fully connected layer: linearly transforms input vectors using weight matrices in order to solve problems.…”
Section: Layer Name, Function, References
confidence: 99%
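The two element-wise operations named in the layer table — ReLU zeroing negatives, and pooling taking the maximum within kernel-bounded segments — can be sketched in a few lines of plain Python (1-D windows for brevity):

```python
def relu(values):
    # Replace every negative entry with zero; positives pass through.
    return [max(0.0, v) for v in values]

def max_pool_1d(values, k):
    # Take the maximum in each non-overlapping window of length k.
    return [max(values[i:i + k]) for i in range(0, len(values), k)]
```

For example, `relu([-2.0, 3.0, -0.5])` gives `[0.0, 3.0, 0.0]`, and `max_pool_1d([1, 5, 2, 4], 2)` gives `[5, 4]`.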
“…The study deployed decision tree, support vector machine, and random forest models to extract and train on statistical characteristics with varying percentages of the testing dataset, with the random forest model achieving a 97.8% accuracy score. Bhusal et al [ 14 ] worked to solve the gradient saturation issue introduced by the sigmoid activation function and to enhance the accuracy with which sleep stages may be classified. The suggested system employed a modified orthogonal convolutional neural network and a modified Adam optimization technique.…”
Section: Literature Review
confidence: 99%
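The gradient saturation mentioned above comes from the sigmoid's derivative, σ′(x) = σ(x)(1 − σ(x)), which peaks at 0.25 at x = 0 and decays toward zero for large |x|, shrinking backpropagated gradients. A quick numeric check:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative sigma'(x) = sigma(x) * (1 - sigma(x)); maximum 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)
```

Evaluating `sigmoid_grad(10.0)` yields a value far below 0.001, illustrating why deep stacks of sigmoid units saturate — the motivation for ReLU-family activations cited in the statement.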