2016
DOI: 10.1007/978-81-322-3667-2_10
More Examples and Open Questions

Cited by 1 publication (1 citation statement)
References 31 publications
“…Advances in time series machine learning have focused recently on self-supervised, representation learning using the transformer architecture [35][36][37][38] to develop foundation models in various time series classification and forecasting tasks. Generally, these models use low frequency data, primarily < 1 Hz, limiting their fine-tuning ability to high frequency sleep datasets.…”
Section: Introduction
Confidence: 99%