2023
DOI: 10.1007/s12652-023-04609-6

A channel-wise attention-based representation learning method for epileptic seizure detection and type classification

Cited by 11 publications (2 citation statements)
References 38 publications
“…This path involves flipping the input data and applying a windowing technique to divide the input into smaller segments. These segments are then processed independently by a TimeDistributed LSTM layer, followed by channel-wise attention, inspired by [58], to let the model focus on the most relevant information. The attended signals are then passed through a simple LSTM layer with 64 neurons and a dense layer.…”
Section: Proposed Deep Learning Model (citation type: mentioning)
confidence: 99%
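
The pipeline quoted above maps naturally onto a few Keras layers. Below is a minimal, illustrative sketch: the window count, window length, channel count, class count, and the squeeze-and-excitation form of the channel-wise attention are all assumptions for the sake of a runnable example, not details taken from the citing paper.

import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed dimensions: 8 windows of 256 samples over 19 EEG channels and a
# 4-class output; none of these values appear in the quoted text.
n_windows, win_len, n_channels, n_classes = 8, 256, 19, 4

inp = layers.Input(shape=(n_windows, win_len, n_channels))

# "Flipping the input data": reverse each window along its time axis
# (the windowing itself is assumed to happen upstream of the model).
flipped = layers.Lambda(lambda t: tf.reverse(t, axis=[2]))(inp)

# Each window is processed independently by a TimeDistributed LSTM.
feats = layers.TimeDistributed(layers.LSTM(64))(flipped)    # (batch, 8, 64)

# Channel-wise attention, squeeze-and-excitation style: learn one weight
# per feature channel and rescale the window features with it.
squeeze = layers.GlobalAveragePooling1D()(feats)            # (batch, 64)
weights = layers.Dense(64, activation="sigmoid")(squeeze)   # (batch, 64)
attended = layers.Multiply()(
    [feats, layers.RepeatVector(n_windows)(weights)])       # (batch, 8, 64)

# "A simple LSTM layer with 64 neurons and a dense layer."
x = layers.LSTM(64)(attended)
out = layers.Dense(n_classes, activation="softmax")(x)

model = models.Model(inp, out)
model.compile(optimizer="adam", loss="categorical_crossentropy")

Note that because the quote places the attention after the TimeDistributed LSTM, the "channels" being reweighted in this sketch are the 64 LSTM feature channels rather than the raw EEG electrodes; the original paper's channel-wise attention may differ.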
“…Early stopping was used as a regularization technique to prevent overfitting. It monitors the validation loss and suspends training when there is no improvement over 10 consecutive epochs [28,58]. Effective and reliable performance relies on balancing a model's learning capacity against its generalizability [53].…”
Section: Model Architecture and Training Details (citation type: mentioning)
confidence: 99%
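
The early-stopping setup described in this statement corresponds directly to the standard Keras callback. A minimal sketch follows: the patience of 10 epochs and the monitored validation loss match the quoted description, while restore_best_weights and the commented fit() arguments are assumptions.

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="val_loss",         # watch the validation loss each epoch
    patience=10,                # stop after 10 epochs without improvement
    restore_best_weights=True,  # assumed: roll back to the best epoch
)

# Typical use, with a held-out validation split:
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=100, callbacks=[early_stop])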