2022
DOI: 10.1609/aaai.v36i5.20472
Open Vocabulary Electroencephalography-to-Text Decoding and Zero-Shot Sentiment Classification

Abstract: State-of-the-art brain-to-text systems have achieved great success in decoding language directly from brain signals using neural networks. However, current approaches are limited to small closed vocabularies, which are far from sufficient for natural communication. In addition, most high-performing approaches require data from invasive devices (e.g., ECoG). In this paper, we extend the problem to open-vocabulary Electroencephalography (EEG)-to-Text sequence-to-sequence decoding and zero-shot sentence sentimen…


Cited by 24 publications (21 citation statements)
References 40 publications
“…Multimodal Learning of Language and Other Brain Signals Recently, language and cognitive data were also used together in multimodal settings to complete desirable tasks (Wang and Ji, 2021; Hollenstein et al., 2019, 2020a, 2021). Wehbe et al. (2014) used a recurrent neural network to perform word alignment between MEG activity and the generated word embeddings.…”
Section: D6 Baseline Results
confidence: 99%
“…Foster et al (2021) applied EEG signals to predict specific values of each dimension in a word vector through regression models. Wang and Ji (2021) used word-level EEG features to decode corresponding text tokens through an open vocabulary, sequence-to-sequence framework. Hollenstein et al (2021) focused on a multimodal approach by utilizing a combination of EEG, eye-tracking, and text data to improve NLP tasks.…”
Section: D6 Baseline Results
confidence: 99%
“…The abundant data in the dataset can facilitate the utilization of modern data-driven methods from NLP in language-related tasks, such as training large-scale models to learn the complex semantic patterns in neural signals, and aligning neural signals with natural languages in the representation space. For example, by using large-scale neural data to train deep learning models, these models can effectively learn the complex semantic representations of the brain under linguistic stimuli and generalize well across a wide range of downstream tasks, such as semantic decoding [39], text-based emotion recognition [40], and sentiment classification [41]. It can also mitigate the challenge of inter-subject generalization in BCI systems caused by the variability of neural signals among individuals.…”
Section: Usage Notes
confidence: 99%
“…Techniques such as model pruning, quantization, and knowledge distillation have proven effective in decreasing the model size and accelerating inference speeds without significantly compromising accuracy [1,2]. Model pruning involves removing redundant or less important parameters from the network, leading to a more efficient model with fewer computational requirements [3]. Quantization techniques convert the model weights and activations from floating-point to lower-bit representations, thus reducing the memory footprint and computational load [4,5].…”
Section: Optimizing Inference in Large Language Models
confidence: 99%
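The excerpt above describes quantization as mapping model weights from floating-point to lower-bit representations. As a minimal illustrative sketch of that idea (symmetric per-tensor int8 quantization with NumPy; not the method of any cited paper):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization: map float weights into int8 [-127, 127]."""
    scale = np.max(np.abs(w)) / 127.0  # one scale factor for the whole tensor
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from the int8 codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)

q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Per-element reconstruction error is bounded by scale / 2 (rounding error),
# while storage drops from 32 bits to 8 bits per weight.
```

Real systems typically use finer granularity (per-channel or per-group scales) and also quantize activations, but the storage-versus-precision trade-off is the same as in this sketch.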