2019
DOI: 10.1007/978-3-030-21642-9_8

ChronoNet: A Deep Recurrent Neural Network for Abnormal EEG Identification

Abstract: Brain-related disorders such as epilepsy can be diagnosed by analyzing electroencephalograms (EEG). However, manual analysis of EEG data requires highly trained clinicians, and is a procedure that is known to have relatively low inter-rater agreement (IRA). Moreover, the volume of the data and the rate at which new data becomes available make manual interpretation a time-consuming, resource-hungry, and expensive process. In contrast, automated analysis of EEG data offers the potential to improve the quality of…

Cited by 133 publications (119 citation statements)
References 26 publications
“…To the best of our knowledge, our work is the first attempt to study few-shot learning for TSC. We formulate the few-shot learning problem for UTSC, and build on top of the following recent advances in deep learning research to develop an effective few-shot approach for TSC: i) gradient-based meta-learning [11,24], ii) residual network with convolutional layers for TSC [40], iii) leveraging multi-length filters to ensure generalizability of filters to tasks with varying time series length and temporal properties [18,29], and iv) triplet loss [35] to ensure generalizability to tasks with varying number of classes without introducing any additional parameters.…”
Section: Related Work
confidence: 99%
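The triplet loss mentioned in the excerpt above (item iv) can be sketched minimally. This is a hypothetical squared-Euclidean, hinge-style variant for illustration only; the cited work [35] may use a different distance function or margin:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: push the anchor-positive distance to be
    at least `margin` smaller than the anchor-negative distance."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(0.0, d_pos - d_neg + margin)
```

Because the loss compares distances between embeddings rather than predicting class scores, it adds no class-dependent parameters, which is why it generalizes across tasks with varying numbers of classes.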
“…In order to quickly adapt to any unseen task, the neural network should be able to extract temporal features at multiple time scales and should ensure that the fine-tuned network can generalize to time series of varying lengths across tasks. We, therefore, use filters of multiple lengths in each convolutional block to capture temporal features at various time scales, as found to be useful in [3,18,29].…”
Section: Neural Network
confidence: 99%
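The multi-length filter idea described in the excerpt above can be sketched as follows. This is an illustrative NumPy-only sketch; the function names, random initialization, and filter lengths (2, 4, 8 — the lengths used by ChronoNet's inception-style blocks) are assumptions for demonstration, not the cited implementation:

```python
import numpy as np

def conv1d_same(x, w):
    """1-D 'same'-padded convolution of signal x with filter w."""
    pad = len(w) // 2
    xp = np.pad(x, (pad, len(w) - 1 - pad))  # pad so output length == input length
    return np.array([np.dot(xp[i:i + len(w)], w) for i in range(len(x))])

def multi_scale_block(x, filter_lengths=(2, 4, 8), rng=None):
    """Apply one filter per length and stack the feature maps, so the block
    captures temporal features at several time scales simultaneously."""
    rng = rng or np.random.default_rng(0)
    feats = [conv1d_same(x, rng.standard_normal(k)) for k in filter_lengths]
    return np.stack(feats)  # shape: (num_scales, len(x))
```

Because every branch produces an output of the input's length, the stacked feature maps can be concatenated regardless of the time-series length of the task at hand, which is the property the fine-tuned network relies on.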
“…Generally, the CNN was applied for better classification performances to various types of images. Recently, the CNN architecture was applied into the BMI fields to consider dynamics of the signal during the movement and to extract static energy feature robustly [32].…”
Section: E. Data Analysis
confidence: 99%
“…In this work, we propose ConvTimeNet (CTN), a deep CNNbased transfer learning approach for UTSC. CTN consists of multiple length 1-D convolutional filters in all convolutional layers (similar to that in InceptionNet [13], [14]) resulting in filters that can capture features at multiple time scales. The key contributions of this work can be summarized as follows:…”
Section: Introduction
confidence: 99%