2022 10th International Winter Conference on Brain-Computer Interface (BCI)
DOI: 10.1109/bci53720.2022.9734950

SEEG signal processing methods in the application of epilepsy recognition

Cited by 4 publications (2 citation statements)
References 21 publications
“…LSTM networks, a subclass of recurrent neural networks (Figure 14), are pivotal in our project for their exceptional ability to process and remember information over extended periods, making them ideal for handling the sequential and temporal nature of EEG signals. Unlike traditional neural networks, LSTMs are designed to avoid the long-term dependency problem, enabling them to remember inputs for long durations with their unique architecture comprising three gates: input, output, and forget gates [39]. Figure 14 shows the structure of a cell unit in the LSTM network.…”
Section: LSTM Description
confidence: 99%
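The statement above describes the LSTM cell's defining feature: three gates that control what is written to, kept in, and read out of the cell state. As a rough illustration of that gated update, the following minimal NumPy sketch computes one cell step; the function name, the packing of the parameters into stacked W, U, b, and the hidden size are assumptions made here for illustration, not the configuration used in the cited work.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, W, U, b):
    # W: (4H, D), U: (4H, H), b: (4H,) hold the stacked parameters for the
    # input (i), forget (f), and output (o) gates and the candidate state (g).
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new information to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much of the cell state to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c_t = f * c_prev + i * g   # long-term memory update
    h_t = o * np.tanh(c_t)     # hidden state passed to the next time step
    return h_t, c_t

Applied to an EEG window, the step would be called once per sample (or per feature frame), carrying h_t and c_t forward through time; it is this recurrent cell state that lets the network retain information across long signal stretches.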
“…The additional benefit of the synthetic procedure, as the authors point out, was deidentification of the original data and significant improvement in data privacy. Multiple additional groups have applied similar data augmentation approaches with various modifications, including different feature extraction methods, different generator and discriminator architectures, different loss functions, utilization of LSTM/GRU cells or attention instead of CNNs, and the application of different classifiers [126][127][128][129][130][131][132][133][134][135][136][137][138][139][140][141][142][143].…”
Section: GANs in EEG Epilepsy Detection
confidence: 99%
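The augmentation approach summarized above pairs a generator that synthesizes EEG-like segments with a discriminator that separates real from synthetic data, and then trains a classifier on the augmented set. The PyTorch sketch below is only a schematic of that adversarial setup under assumed segment length, latent dimension, and plain fully connected architectures; the cited groups vary the feature extraction, generator/discriminator designs, loss functions, and downstream classifiers.

import torch
import torch.nn as nn

SEG_LEN = 256   # assumed length of one single-channel EEG segment (samples)
LATENT  = 100   # assumed latent-noise dimension

# Generator: noise vector -> synthetic EEG segment scaled to [-1, 1]
generator = nn.Sequential(
    nn.Linear(LATENT, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, SEG_LEN), nn.Tanh(),
)

# Discriminator: EEG segment -> probability that the segment is real
discriminator = nn.Sequential(
    nn.Linear(SEG_LEN, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_batch):
    # One adversarial update; real_batch has shape (B, SEG_LEN).
    B = real_batch.size(0)
    fake = generator(torch.randn(B, LATENT))

    # Discriminator: push real segments toward 1, synthetic segments toward 0.
    opt_d.zero_grad()
    loss_d = bce(discriminator(real_batch), torch.ones(B, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(B, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: try to make the discriminator label its output as real.
    opt_g.zero_grad()
    loss_g = bce(discriminator(fake), torch.ones(B, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

After training, generated segments would be pooled with the original recordings to enlarge the training set, which is also how the synthetic procedure can improve privacy: the classifier can be trained without exposing the identifiable raw data.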