2022
DOI: 10.2139/ssrn.4191945

EEG-Based Epileptic Seizure Prediction Using Temporal Multi-Channel Transformers

Cited by 2 publications (2 citation statements)
References 0 publications
“…Additionally, the performance metrics showed that the inclusion of convolutional layers and attention-based pooling in the model enhances performance and reduces the number of Transformer encoder layers, significantly reducing the computational complexity. Similarly, [84] explores definitions of preictal and interictal states in the CHB-MIT Scalp EEG Database, evaluating 30 and 60 minutes before seizures. The interictal state spans 4 hours after the last seizure and 4 hours before the next, with a 5-minute pre-seizure window for timely alerts.…”
Section: A Seizure Detection
confidence: 99%
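
As a rough illustration of the preictal/interictal labeling scheme summarized in that statement, the following Python sketch assigns a label to an EEG segment start time: preictal if it falls in the 30-minute window before a seizure onset (stopping 5 minutes before onset to leave time for an alert), interictal if it is at least 4 hours away from every seizure. The function and variable names, and the exact handling of the 5-minute gap, are assumptions for illustration only, not details taken from the cited paper.

# Minimal sketch, assuming the window definitions quoted above.
PREICTAL_MIN = 30 * 60        # preictal window: 30 (or 60) minutes before onset, in seconds
INTERICTAL_GAP = 4 * 60 * 60  # interictal: >= 4 h after the last and before the next seizure
ALERT_GAP = 5 * 60            # 5-minute pre-seizure gap reserved for a timely alert

def label_segment(t, seizure_onsets, seizure_offsets):
    """Label the EEG segment starting at time t (seconds) as
    'preictal', 'interictal', or None (excluded from training)."""
    for onset in seizure_onsets:
        # Preictal: inside the window before onset, excluding the alert gap.
        if onset - PREICTAL_MIN <= t < onset - ALERT_GAP:
            return "preictal"
    # Interictal: at least 4 h after the previous seizure's end and
    # at least 4 h before the next seizure's start.
    far_from_all = all(
        t >= off + INTERICTAL_GAP or t <= on - INTERICTAL_GAP
        for on, off in zip(seizure_onsets, seizure_offsets)
    )
    return "interictal" if far_from_all else None

# Example with one hypothetical seizure from t=20000 s to t=20060 s:
# label_segment(18500, [20000], [20060]) -> 'preictal'
# label_segment(1000,  [20000], [20060]) -> 'interictal'
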
“…In this sense, our work seeks to follow up on these efforts to understand epilepsy (de Godoy et al. 2022; Hramov et al. 2016; Mohseni et al.…”
Section: Introduction
confidence: 99%