2021
DOI: 10.1109/tnsre.2021.3098968
RobustSleepNet: Transfer Learning for Automated Sleep Staging at Scale

Abstract: Sleep disorder diagnosis relies on the analysis of polysomnography (PSG) records. As a preliminary step of this examination, sleep stages are systematically determined. In practice, sleep stage classification relies on the visual inspection of 30-second epochs of PSG signals. Numerous automatic approaches have been developed to replace this tedious and expensive task. Although these methods demonstrated better performance than human sleep experts on specific datasets, they remain largely unused in s…


Cited by 78 publications (52 citation statements)
References 24 publications
“…13,19 By deciding the number of input epochs and corresponding output labels, strategies of model design can be divided into (i) one-to-one, [20][21][22][23] (ii) many-to-one, [24][25][26] and (iii) many-to-many. [27][28][29][30] Many recent papers use the many-to-many prediction strategy, since it can achieve better performance by exploiting contextual input information and contextual output generation. We also adopted a many-to-many strategy for our proposed network.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
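The one-to-one vs. many-to-many distinction in the excerpt above comes down to input/output shapes: one 30-s epoch in and one label out, versus a sequence of L epochs in and L labels out. A minimal NumPy sketch, where a shared linear map is a hypothetical stand-in for a real sequence model (RNN/transformer); feature sizes and names are illustrative, not from the paper:

```python
import numpy as np

N_STAGES = 5  # W, N1, N2, N3, REM

def one_to_one(epoch_features, weights):
    """One-to-one: a single 30-s epoch -> a single stage label."""
    logits = epoch_features @ weights            # (n_feat,) @ (n_feat, 5) -> (5,)
    return int(np.argmax(logits))

def many_to_many(sequence_features, weights):
    """Many-to-many: L consecutive epochs -> L stage labels.
    A real model would mix information across the sequence (context);
    here a shared linear map stands in for it."""
    logits = sequence_features @ weights         # (L, n_feat) @ (n_feat, 5) -> (L, 5)
    return np.argmax(logits, axis=1)             # one label per input epoch

rng = np.random.default_rng(0)
n_feat = 8
W = rng.normal(size=(n_feat, N_STAGES))          # toy "classifier" weights
seq = rng.normal(size=(20, n_feat))              # 20 epochs = 10 min of recording

labels = many_to_many(seq, W)                    # shape (20,): as many labels as epochs
```

The practical appeal of many-to-many is exactly this shape symmetry: a full night is scored in far fewer forward passes, and the model can condition each label on its neighbors.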
“…There are several areas of transfer learning research which are underexplored for sleep staging. Much of supervised transfer learning is done using very simple methods such as re-training a few layers of the model (hereafter referred to as head re-training) or retraining the entire model at a smaller learning rate [13][14][15]. However, there are other more sophisticated transfer learning methods such as Correlation Alignment (CORAL) [16], Deep Domain Confusion (DDC) [17], and Subspace Alignment (SA) [18] which are rarely tested on sleep staging tasks.…”
Section: Limitations in Current Transfer Learning Research
Citation type: mentioning (confidence: 99%)
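Of the methods the excerpt names, CORAL has a simple closed-form, classifier-agnostic formulation: whiten the source features with their own covariance, then re-color them with the target domain's covariance. A minimal NumPy sketch, assuming per-epoch feature vectors; the "lab A"/"lab B" framing and dimensions are illustrative, not from the paper:

```python
import numpy as np

def _matpow_psd(m, p):
    """Matrix power of a symmetric PSD matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(m)
    vals = np.clip(vals, 0.0, None)
    return (vecs * vals**p) @ vecs.T

def coral(source, target, eps=1e-6):
    """Correlation Alignment (CORAL): transform source features so their
    second-order statistics match the target domain's. `eps` regularizes
    the covariances so the whitening step is well-conditioned."""
    d = source.shape[1]
    cs = np.cov(source, rowvar=False) + eps * np.eye(d)
    ct = np.cov(target, rowvar=False) + eps * np.eye(d)
    z = source - source.mean(axis=0)
    # whiten with source covariance, re-color with target covariance
    aligned = z @ _matpow_psd(cs, -0.5) @ _matpow_psd(ct, 0.5)
    return aligned + target.mean(axis=0)

rng = np.random.default_rng(1)
src = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # e.g. features from "lab A"
tgt = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # e.g. features from "lab B"
aligned = coral(src, tgt)  # aligned now matches tgt's mean and covariance (approx.)
```

Because the transform touches only the features, any downstream classifier can be trained on the aligned source data unchanged, which is what makes such methods cheap to test against head re-training baselines.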
“…Performance comparison (per dataset, values as given; complete entries read Accuracy / Cohen's κ / MF1):
(… 2017): 82.0 / 0.760 / 76.9; 86.4 / 0.805 / 82.2
FCNN+RNN (Phan et al. 2021): 83.5 / 0.775 / 77.7; 86.4 / 0.806 / 82.1; 88.1 / 0.832 / 80.9
TinySleepNet (Supratak and Guo 2020): 85.4 / 0.800 / 80.5; 83.1 / 0.77 / 78.1
RobustSleepNet (Guillot and Thorey 2021): 81.7; 82.5; 80.0
SleepTransformer (Phan et al. 2022): 87.7 / 0.828 / 80.1
SeqSleepNet (Phan et al.…”
Section: Applications
Citation type: mentioning (confidence: 99%)