2022
DOI: 10.48550/arxiv.2207.07858
Preprint

The Lottery Ticket Hypothesis for Self-attention in Convolutional Neural Network

Cited by 2 publications (2 citation statements); References 0 publications.
“…The CNN was employed to extract spatial information from image frames and the BiLSTM module was applied to capture temporal dependencies between features extracted from different frames. Inspired by the ability of attention mechanism to enhance the model generalization [10][11][12][13] , a temporal attention module with Bahdanau's additive style 14 was added to enhance the temporal learning of the calcium dynamics.…”
Section: Introduction (mentioning; confidence: 99%)
“…Neural Networks [12], Convolutional Neural Networks [13], Graph Convolution Neural Networks, and Attention Convolutional Neural Networks [14]), our method significantly reduced the computation time and achieved a precision of about 80%. Importantly, we provide an algorithm that balances computation time and precision, so that we can further explore the biological significance represented by the cavities.…”
Section: Introduction (mentioning; confidence: 99%)