2020
DOI: 10.1177/1729881420938927

Nonlocal spatial attention module for image classification

Abstract: To enhance the capability of neural networks, research on attention mechanisms has deepened. In this area, attention modules make forward inference along the channel dimension and spatial dimension sequentially, in parallel, or simultaneously. However, we have found that spatial attention modules mainly apply convolution layers to generate attention maps, which aggregate feature responses based only on local receptive fields. In this article, we take advantage of this finding to create a nonlocal spatial atten…

Cited by 17 publications (9 citation statements)
References 42 publications (162 reference statements)
“…A. Network structure SAM [28] (Spatial Attention Module): after the maximum and average pooling of the input convolution block, the position of the entire feature area can be determined through a convolution operation, and the influence of noise and rotation on the image can be filtered out. The definition of SAM is as follows:…”
Section: The Proposed Methods
confidence: 99%
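The SAM described in the citation above (channel-wise max and average pooling followed by a convolution and a sigmoid) can be sketched as follows. This is a minimal, untrained illustration in numpy, not the authors' implementation: the random `weight` filter stands in for the learned convolution kernel, and the 7×7 kernel size is an assumption.

```python
import numpy as np

def spatial_attention(x, kernel_size=7):
    """Sketch of a CBAM-style Spatial Attention Module (SAM).

    x: feature map of shape (C, H, W).
    Pools along the channel axis (max + mean), stacks the two maps,
    applies a single convolution filter (random weights stand in for
    learned ones), and squashes to a (0, 1) attention map via sigmoid.
    """
    c, h, w = x.shape
    max_pool = x.max(axis=0)     # (H, W) channel-wise max pooling
    avg_pool = x.mean(axis=0)    # (H, W) channel-wise average pooling
    pooled = np.stack([max_pool, avg_pool], axis=0)  # (2, H, W)

    # Untrained stand-in for the learned k x k conv; pad to keep H, W.
    rng = np.random.default_rng(0)
    weight = rng.standard_normal((2, kernel_size, kernel_size)) * 0.1
    pad = kernel_size // 2
    padded = np.pad(pooled, ((0, 0), (pad, pad), (pad, pad)))
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(
                padded[:, i:i + kernel_size, j:j + kernel_size] * weight)

    attn = 1.0 / (1.0 + np.exp(-out))  # sigmoid -> (H, W) map in (0, 1)
    return x * attn                    # rescale, broadcast over channels
```

Because the attention map lies in (0, 1), the output feature map has the same shape as the input but with each spatial position scaled down according to its estimated importance.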
“…That is, the attention method uses features obtained from different parts of a network as weights that act on other parts, learning more substantial sequential information. Current attention methods can be divided into two kinds: channel attention [17] and spatial attention methods [8]. Specifically, the channel attention method emphasizes the effects of channel features on the whole CNN.…”
Section: Attention Mechanisms For Image Classification
confidence: 99%
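The channel attention counterpart mentioned above (emphasizing the effect of each channel on the whole CNN) can be sketched in the same spirit. This is an SE-style illustration under stated assumptions: the two-layer bottleneck with a `reduction` factor follows the common squeeze-and-excitation design, and the random weights stand in for learned ones.

```python
import numpy as np

def channel_attention(x, reduction=2):
    """Sketch of an SE-style channel attention block.

    x: feature map of shape (C, H, W).
    Global-average-pools each channel to a C-vector, passes it through
    a two-layer bottleneck (ReLU then sigmoid; random, untrained
    weights), and rescales every channel by its attention weight.
    """
    c = x.shape[0]
    squeeze = x.mean(axis=(1, 2))            # (C,) per-channel statistic
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeeze, 0.0)   # ReLU bottleneck
    scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid, (C,) in (0, 1)
    return x * scale[:, None, None]          # rescale each channel
```

In contrast to the spatial module, the attention weights here are one scalar per channel, so spatial structure within each channel is left untouched.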
“…Inspired by that, scholars have combined attention mechanisms with residual networks [18]. These are divided into five kinds: single-path attention [18], multi-path attention [19], channel attention [17], spatial attention [8], and the combination of channel and spatial attention [20]. Specifically, a single-path attention method mainly uses its [17] enhanced the effects of different channels to improve the classification results.…”
Section: Attention-based CNNs For Image Classification
confidence: 99%