Abstract—Objective: Deep learning is increasingly used for brain-computer interfaces (BCIs). However, the available brain data are scarce, especially for invasive BCIs, which can dramatically deteriorate deep learning performance. Data augmentation (DA) methods, such as generative models, can help to address this issue. However, existing studies on brain signals have relied on convolutional neural networks (CNNs) and ignored temporal dependence. This paper aims to enhance the generative model by capturing the temporal relationship from a time-series perspective.
Methods: A conditional generative network based on the transformer model (cTGAN) is proposed and tested on stereo-electroencephalography (SEEG) data recorded from eight epilepsy patients performing five different movements. Three other commonly used DA methods were also implemented: noise injection (NI), variational autoencoder (VAE), and conditional Wasserstein GAN with gradient penalty (cWGAN-GP). Artificial SEEG data were generated, and various metrics were used to compare data quality, including visual inspection, cosine similarity (CS), Jensen-Shannon distance (JSD), and the effect on the performance of a deep learning-based classifier.
Results: Both the proposed cTGAN and cWGAN-GP were able to generate realistic data, whereas NI and VAE produced inferior samples when visualised as raw sequences and in a lower-dimensional space. The cTGAN generated the best samples in terms of cosine similarity and Jensen-Shannon distance and significantly outperformed cWGAN-GP in enhancing the performance of a deep learning-based classifier (yielding significant improvements of 6% and 3.4%, respectively).
Conclusion: This paper demonstrates that a generative model preserving temporal dependence is superior both for data generation and for boosting deep learning performance on SEEG signals.
Significance: This is the first time that DA methods have been applied to invasive BCIs based on SEEG. In addition, this study demonstrates the advantages of a model that preserves temporal dependence from a time-series perspective.
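For readers unfamiliar with the two similarity metrics named above, the following is a minimal sketch of how cosine similarity and Jensen-Shannon distance can be computed between real and generated SEEG trials. The array shapes and the histogram-based JSD estimate are illustrative assumptions, not the evaluation pipeline used in this paper.

```python
# Hypothetical sketch: cosine similarity (CS) and Jensen-Shannon distance (JSD)
# between real and generated SEEG trials. Shapes and binning are assumptions.
import numpy as np
from scipy.spatial.distance import jensenshannon


def cosine_similarity(real: np.ndarray, fake: np.ndarray) -> float:
    """Mean cosine similarity between matched real/generated trials.

    Both arrays are assumed to have shape (n_trials, n_channels, n_samples).
    """
    r = real.reshape(len(real), -1)
    f = fake.reshape(len(fake), -1)
    num = np.sum(r * f, axis=1)
    den = np.linalg.norm(r, axis=1) * np.linalg.norm(f, axis=1) + 1e-12
    return float(np.mean(num / den))


def js_distance(real: np.ndarray, fake: np.ndarray, bins: int = 64) -> float:
    """Jensen-Shannon distance between amplitude distributions.

    Distributions are estimated with shared-bin histograms over all trials
    and channels (an assumption; other density estimates would also work).
    """
    lo = min(real.min(), fake.min())
    hi = max(real.max(), fake.max())
    p, _ = np.histogram(real, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(fake, bins=bins, range=(lo, hi), density=True)
    return float(jensenshannon(p, q))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    real = rng.standard_normal((32, 8, 500))               # placeholder "real" trials
    fake = real + 0.1 * rng.standard_normal(real.shape)    # placeholder "generated" trials
    print(f"CS : {cosine_similarity(real, fake):.3f}")
    print(f"JSD: {js_distance(real, fake):.3f}")
```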