2023
DOI: 10.1109/jbhi.2023.3242262
Deep Learning With Convolutional Neural Networks for Motor Brain-Computer Interfaces Based on Stereo-Electroencephalography (SEEG)

Abstract: Deep learning based on convolutional neural networks (CNN) has achieved success in brain-computer interfaces (BCIs) using scalp electroencephalography (EEG). However, the interpretation of the so-called 'black box' method and its application in stereo-electroencephalography (SEEG)-based BCIs remain largely unknown. Therefore, in this paper, an evaluation is performed on the decoding performance of deep learning methods on SEEG signals. Methods: Thirty epilepsy patients were recruited, and a paradigm including …

Cited by 6 publications (5 citation statements)
References 38 publications
“…As another transformer-based GAN study showed that the transformer alone led to obvious high-frequency artefacts in the generated data [75], an extra filter layer was added to the transformer model to further regulate the generated data in the spectral domain. On the other hand, it has been demonstrated that a CNN can act as a filter in EEG decoding tasks [73,80]. This has been validated with an ablation study by checking the spectral contents of data generated with and without the filter layer.…”
Section: The Extra Filter Layer
confidence: 86%
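The "CNN can act as a filter" point above can be illustrated with a fixed windowed-sinc low-pass filter applied as a 1-D convolution — a minimal NumPy sketch, not the cited models' actual filter layer; the 90 Hz cutoff, tap count, sampling rate, and test frequencies are all illustrative assumptions:

```python
import numpy as np

def fir_lowpass(cutoff_hz, fs, num_taps=101):
    """Windowed-sinc FIR low-pass coefficients (Hamming window)."""
    t = np.arange(num_taps) - (num_taps - 1) / 2
    fc = cutoff_hz / fs                  # normalised cutoff (cycles/sample)
    h = 2 * fc * np.sinc(2 * fc * t)     # ideal low-pass impulse response
    h *= np.hamming(num_taps)            # taper to reduce spectral ripple
    return h / h.sum()                   # unity gain at DC

def filter_layer(x, h):
    """Apply the filter as a 1-D convolution, like a fixed Conv1d layer."""
    return np.convolve(x, h, mode="same")

fs = 1000                                # assumed sampling rate (Hz)
t = np.arange(fs) / fs
low = np.sin(2 * np.pi * 10 * t)         # 10 Hz component (passband)
high = np.sin(2 * np.pi * 200 * t)       # 200 Hz artefact (stopband)
h = fir_lowpass(cutoff_hz=90, fs=fs)
out = filter_layer(low + high, h)        # high-frequency content suppressed
```

In a trainable network the same operation is a convolutional layer whose kernel is (or is initialised to) these coefficients, which is why a CNN can learn to behave like a band-limiting filter.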
“…Therefore, before conducting the experiment using the proposed and baseline methods, a sliding-window strategy was adopted to augment the SEEG data as a starting point [17]. The window and sliding step were, respectively, set to 500 ms and 100 ms (sampling rate of 1000 Hz) [80]. This windowing process split the original trials (10 s) into multiple shorter subtrials (500 ms).…”
Section: DA Model Training Process
confidence: 99%
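The sliding-window augmentation described above (500 ms window, 100 ms step, 1000 Hz sampling, 10 s trials) can be sketched as follows; the channel count and NumPy array layout are illustrative assumptions, not the authors' code:

```python
import numpy as np

def sliding_windows(trial, win_ms=500, step_ms=100, fs=1000):
    """Split one trial (channels x samples) into overlapping sub-trials."""
    win = int(win_ms * fs / 1000)        # 500 samples per window
    step = int(step_ms * fs / 1000)      # advance 100 samples each step
    n = trial.shape[-1]
    starts = range(0, n - win + 1, step)
    return np.stack([trial[..., s:s + win] for s in starts])

# a mock 10 s SEEG trial with 8 channels at 1000 Hz
trial = np.random.randn(8, 10_000)
subtrials = sliding_windows(trial)
# (10000 - 500) / 100 + 1 = 96 sub-trials of 500 samples each
```

Each 10 s trial therefore yields 96 overlapping 500 ms sub-trials, a roughly 100-fold increase in training examples.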
“…This process is repeated until the desired number of channels is selected. To extract features during the computation of the MI, we extract spectral features in 0.5-4 Hz, 4-8 Hz, 8-13 Hz, 13-30 Hz, 60-75 Hz, 75-95 Hz, 105-125 Hz, and 125-150 Hz, as in our previous study [35]. Note that these extracted features were only used to select channels, while movement classification was performed using raw SEEG signals of the selected channels.…”
Section: B Channel Selection Methods
confidence: 99%
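A hedged sketch of per-band spectral feature extraction over the eight bands listed above; the simple FFT-based power estimate and function names are my assumptions (the MI-driven greedy channel selection itself is not shown):

```python
import numpy as np

BANDS = [(0.5, 4), (4, 8), (8, 13), (13, 30),
         (60, 75), (75, 95), (105, 125), (125, 150)]  # Hz, from the text

def band_powers(x, fs=1000, bands=BANDS):
    """Mean spectral power per band for one channel (1-D signal)."""
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size        # crude periodogram
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

x = np.sin(2 * np.pi * 10 * np.arange(2000) / 1000)   # 10 Hz test tone, 2 s
feats = band_powers(x)                                # 8 features per channel
```

A 10 Hz tone concentrates its power in the 8-13 Hz band (index 2), so the feature vector peaks there; in practice each channel contributes one such 8-dimensional vector to the MI computation.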
“…Then, channels showing high ERS or ERD by visual inspection during the task stage were chosen. The calculation of ERS/ERD and the procedure to sort the channels according to the electrode reactivity were described in our previous work [35].…”
Section: B Channel Selection Methods
confidence: 99%
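ERD/ERS is conventionally expressed as the relative band-power change of the task stage versus a baseline; below is a minimal sketch under that convention (the numbers are made up, and the exact formula in [35] may differ):

```python
import numpy as np

def erd_ers(power_task, power_base):
    """Relative power change (%) of the task stage vs. baseline.
    Negative values = ERD (desynchronisation), positive = ERS."""
    return 100.0 * (power_task - power_base) / power_base

# hypothetical per-channel band powers (baseline vs. movement stage)
base = np.array([4.0, 2.0, 5.0])
task = np.array([2.0, 2.8, 5.5])
change = erd_ers(task, base)             # -50% ERD, +40% ERS, +10% ERS
# rank channels by reactivity: largest absolute ERD/ERS first
order = np.argsort(-np.abs(change))
```

Channels at the top of `order` show the strongest task-related reactivity and would be the ones retained by the visual-inspection step described in the quote.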
“…Deep neural networks are adept at learning hierarchical representations of features from transactional and user behavior data [52]. CNNs are effective for spatial data, such as images associated with fraud detection [53], while RNNs are valuable for capturing temporal dependencies in sequential data [54]. Transformer models, originally developed for natural language processing, have shown promise in capturing long-range dependencies and contextual information relevant to fraud detection [55,56].…”
Section: Deep Learning Approaches To Fraud Detection
confidence: 99%