2020
DOI: 10.1101/2020.06.02.129114
Preprint

Decoding neural signals and discovering their representations with a compact and interpretable convolutional neural network

Abstract: Brain-computer interfaces (BCIs) decode information from neural activity and send it to external devices. In recent years, we have seen an emergence of new algorithms for BCI decoding. Here we propose a compact architecture for adaptive decoding of electrocorticographic (ECoG) data into finger kinematics. We also describe a theoretically justified approach to interpreting the spatial and temporal weights in the architectures that combine adaptation in both space and time, such as ours. In these archit…
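The abstract describes a decoder that pairs adaptive spatial weights with adaptive temporal weights. As a rough illustration only (the preprint's exact layer composition is not reproduced here), the sketch below shows one common way such a compact ECoG decoder can be factorized in PyTorch: a 1x1 convolution acts as the learned spatial stage, a depthwise convolution acts as the learned temporal stage, and a linear readout maps the resulting features to finger kinematics. All sizes (64 electrodes, 8 spatial components, a 65-sample temporal kernel, 5 finger outputs) are placeholder assumptions, not values from the paper.

```python
# Hypothetical sketch (not the authors' code): a compact decoder that factorizes
# ECoG processing into a learned spatial projection followed by per-component
# temporal filtering, mirroring the "adaptation in both space and time" idea.
import torch
import torch.nn as nn

class CompactSpatioTemporalDecoder(nn.Module):
    def __init__(self, n_channels=64, n_spatial=8, kernel_size=65, n_outputs=5):
        super().__init__()
        # Spatial stage: 1x1 convolution mixes ECoG channels into a few
        # spatially filtered components (interpretable spatial weights).
        self.spatial = nn.Conv1d(n_channels, n_spatial, kernel_size=1, bias=False)
        # Temporal stage: one FIR-like filter per component via a depthwise
        # convolution (interpretable temporal weights).
        self.temporal = nn.Conv1d(n_spatial, n_spatial, kernel_size=kernel_size,
                                  groups=n_spatial, padding=kernel_size // 2,
                                  bias=False)
        self.nonlin = nn.ReLU()              # crude rectification of filtered signals
        self.pool = nn.AdaptiveAvgPool1d(1)  # summarize each component over time
        self.readout = nn.Linear(n_spatial, n_outputs)  # finger kinematics

    def forward(self, x):
        # x: (batch, n_channels, n_samples) raw ECoG segment
        z = self.temporal(self.spatial(x))
        feats = self.pool(self.nonlin(z)).squeeze(-1)
        return self.readout(feats)

# Usage: decode a batch of 1-second, 1 kHz ECoG windows into 5 finger trajectories.
model = CompactSpatioTemporalDecoder(n_channels=64, n_outputs=5)
y_hat = model(torch.randn(16, 64, 1000))  # -> shape (16, 5)
```

Keeping the spatial and temporal stages as separate linear convolutions is what makes their weights amenable to the kind of spatial and temporal interpretation the abstract mentions.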

Cited by 2 publications (2 citation statements)
References 24 publications
“…Recently, researchers reported that using deep neural networks (DNN) for decoding can improve decoding accuracy. Studies have used Restricted Boltzmann machine [126], LSTM [127]- [129], Convolutional Neural Network (CNN) [130], [131], temporal convolutional network (TCN) [132], recurrent neural network (RNN), QuasiRNN [133], multiplicative RNN (MRNN) [134], gated recurrent unit network (GRU) [135], and combination of CNN and LSTM [118], [136] for discrete and continuous movement-related parameters decoding. DNN models need to be trained by large-scale datasets in order to avoid overfitting and obtain high accuracies [137], [138].…”
Section: Barroso et al. (2019) [34] Rat
confidence: 99%
“…Among them LSTMs are the most commonly used because they are able to learn long-range dependencies better than other recurrent structures [1][2][3][4][5][6]. Convolutional Neural Networks (CNNs) are also frequently used for decoding neural signals in the form of fMRI image, calcium image or multi-channel EEG waves [7][8][9] because they are able to learn local dependencies of the data. Although most of these deep learning algorithms have achieved better performances compared with traditional machine learning methods, they still suffer from problems such as gradient vanishing and the weakness of extracting global features.…”
Section: Introduction
confidence: 99%