2021
DOI: 10.1007/978-3-030-86517-7_5
ConCAD: Contrastive Learning-Based Cross Attention for Sleep Apnea Detection

Cited by 14 publications (4 citation statements)
References 32 publications
“…Our method performed better than other existing methods in all the evaluated metrics except the Sn and F1 metrics, which were slightly worse (1%) compared to the method proposed by Almutairi et al (2021). Several studies using extended segments, such as SE-MSCNN (Chen et al 2022b), ConCAD (Huang and Ma 2021), and CNN-BiGRU (Chen et al 2022a), performed well, suggesting a positive effect of feeding the original and extended segments together into an effective deep-shallow fusion network.…”
Section: Apnea-ECG Dataset
confidence: 75%
“…The authors of [177] proposed to use both expert features and DNN features. They proposed a contrastive-based cross-attention model to predict sleep apnea through ECG signals.…”
Section: Contrastive Learning
confidence: 99%
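The cross-attention fusion of expert features and DNN features described in the statement above can be sketched generically as scaled dot-product attention, with queries drawn from one feature stream and keys/values from the other. This is an illustrative sketch only, not the authors' implementation from [177]; the function name and the assumption that both streams are projected to the same dimensionality are hypothetical.

```python
import numpy as np

def cross_attention(queries, keys_values):
    """Illustrative scaled dot-product cross-attention (not the ConCAD code).

    queries:     (M, D) features from one stream, e.g. expert features.
    keys_values: (N, D) features from the other stream, e.g. DNN features.
    Returns the attended features (M, D) and the attention weights (M, N).
    """
    d = keys_values.shape[-1]
    # similarity of each query to every key, scaled by sqrt(D)
    scores = queries @ keys_values.T / np.sqrt(d)
    # numerically stable softmax over the key axis
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)
    # each output row is a weighted mixture of the other stream's features
    return w @ keys_values, w

# Toy usage with random features of matching dimension
rng = np.random.default_rng(0)
expert = rng.normal(size=(3, 4))
dnn = rng.normal(size=(5, 4))
attended, weights = cross_attention(expert, dnn)
```

Each row of `weights` sums to 1, so every expert-feature query receives a convex combination of the DNN features.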
“…Then, the fused feature vector goes through the projection for contrastive loss and is passed to the classifiers for apnea detection [177]. Image taken from [177].…”
Section: Contrastive Learning
confidence: 99%
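The "projection for contrastive loss" step mentioned in the statement above can be sketched with a generic supervised contrastive loss over projected embeddings, in the style of Khosla et al. (2020): embeddings of the same class (e.g. apnea vs. normal segments) are pulled together and different classes pushed apart. This is a minimal sketch under that assumption, not the loss used in [177].

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Generic supervised contrastive loss (illustrative, not the ConCAD loss).

    features: (N, D) projected embeddings for a batch.
    labels:   (N,) integer class labels, e.g. 0 = normal, 1 = apnea.
    """
    # L2-normalise so dot products are cosine similarities
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T / temperature
    # exclude each sample's comparison with itself
    mask_self = np.eye(len(f), dtype=bool)
    sim_exp = np.exp(sim)
    sim_exp[mask_self] = 0.0
    # log-softmax over all non-self pairs per anchor
    log_prob = sim - np.log(sim_exp.sum(axis=1, keepdims=True))
    # positives = other samples sharing the anchor's label
    pos = (labels[:, None] == labels[None, :]) & ~mask_self
    per_anchor = -(log_prob * pos).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return per_anchor.mean()
```

With well-separated class clusters the loss is small; with classes mixed together it grows, which is the signal that drives the projection head during training.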
“…While DL has been effectively employed in many neuroscience applications, challenges remain with the quality and availability of the data (Banville et al, 2021; Younes, 2017; Huang and Ma, 2021). Recently, contrastive learning has been shown to be an effective Self-Supervised Learning (SSL) technique for addressing limited data availability, noisy labels, and noisy data (He et al, 2020; Zbontar et al, 2021; Grill et al, 2020).…”
Section: Introduction
confidence: 99%