2023
DOI: 10.48550/arxiv.2301.03441
Preprint

L-SeqSleepNet: Whole-cycle Long Sequence Modelling for Automatic Sleep Staging

Abstract: Human sleep is cyclical with a period of approximately 90 minutes, implying long temporal dependency in the sleep data. Yet, exploring this long-term dependency when developing sleep staging models has remained untouched. In this work, we show that while encoding the logic of a whole sleep cycle is crucial to improve sleep staging performance, the sequential modelling approach in existing state-of-the-art deep learning models is inefficient for that purpose. We thus introduce a method for efficient long seque…
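
For context, the efficient long-sequence idea can be illustrated with a fold-then-unfold scheme: the whole-cycle epoch sequence is split into short subsequences handled by one recurrent pass, whose outputs are stitched back together for a second pass over the full sequence. The sketch below is an assumption-laden illustration (PyTorch, GRU layers, a fold factor of 10, 200 input epochs covering roughly a full cycle); the truncated abstract does not confirm these specifics.

# Minimal PyTorch sketch of whole-cycle long-sequence modelling via
# sequence folding. The fold factor, layer sizes, and the use of GRUs
# are illustrative assumptions, not the authors' exact architecture.
import torch
import torch.nn as nn

class LongSequenceStager(nn.Module):
    def __init__(self, feat_dim=128, hidden=64, n_classes=5, fold=10):
        super().__init__()
        self.fold = fold  # number of subsequences the long input is folded into
        # short-term model: runs within each folded subsequence
        self.inner = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        # long-term model: runs across the whole (unfolded) sequence
        self.outer = nn.GRU(2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, feat_dim), e.g. seq_len = 200 sleep epochs
        b, t, d = x.shape
        assert t % self.fold == 0, "seq_len must be divisible by the fold factor"
        # fold into (batch * fold) short subsequences of length seq_len / fold
        x = x.reshape(b * self.fold, t // self.fold, d)
        x, _ = self.inner(x)                      # model short-range context
        x = x.reshape(b, t, -1)                   # unfold back to the full sequence
        x, _ = self.outer(x)                      # model whole-cycle context
        return self.classifier(x)                 # per-epoch stage logits

logits = LongSequenceStager()(torch.randn(2, 200, 128))
print(logits.shape)  # torch.Size([2, 200, 5])

Because each recurrent pass runs over short sequences (within folds, then over fold outputs), this avoids unrolling a single recurrence over hundreds of steps, which is the inefficiency the abstract attributes to existing sequential models.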

Cited by 2 publications (1 citation statement)
References 27 publications

“…The last 4 layers of the TF encoder process the outer sequence, i.e., the interactions between these aggregated features of neighbouring windows. We allow for a maximum of 21 sequential windows, following [45,44], but recent works indicate that even greater numbers could be beneficial [42]. The Inner-Outer scheme is visualised in Figure 1b.…”
Section: Core-Sleep Architecture (mentioning)
Confidence: 99%
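
The inner-outer split described in the statement above, per-window feature aggregation followed by transformer layers over the window sequence, can be sketched as follows. This is a hypothetical PyTorch reconstruction: the 4 outer layers and the cap of 21 windows come from the quoted statement, while every other name, dimension, and the mean-pooling aggregation are assumptions.

# Hypothetical sketch of an inner-outer transformer scheme: inner layers
# encode tokens within each window, their outputs are aggregated to one
# feature per window, and 4 outer layers model interactions across up to
# 21 neighbouring windows. Dimensions and module names are illustrative.
import torch
import torch.nn as nn

def encoder(d_model, n_layers):
    layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=n_layers)

class InnerOuterEncoder(nn.Module):
    def __init__(self, d_model=128, inner_layers=4, outer_layers=4):
        super().__init__()
        self.inner = encoder(d_model, inner_layers)  # within-window tokens
        self.outer = encoder(d_model, outer_layers)  # across-window sequence

    def forward(self, x):
        # x: (batch, windows, tokens, d_model), windows <= 21
        b, w, t, d = x.shape
        inner_out = self.inner(x.reshape(b * w, t, d))
        window_feats = inner_out.mean(dim=1).reshape(b, w, d)  # one feature per window
        return self.outer(window_feats)  # (batch, windows, d_model)

out = InnerOuterEncoder()(torch.randn(2, 21, 30, 128))
print(out.shape)  # torch.Size([2, 21, 128])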