2020
DOI: 10.1007/978-3-030-58577-8_6
Weight Excitation: Built-in Attention Mechanisms in Convolutional Neural Networks

Cited by 23 publications (18 citation statements). References 17 publications.
“…In the WEB, a sub-block for location-based weight excitation (LWE) proposed in [31] is used. The LWE provides fine-grained weight-wise attention during backpropagation.…”
Section: Position Attention-aware Weight Excitation (PAWE)
confidence: 99%
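The weight-wise attention described above amounts to reparameterizing each convolution weight as a(w) · w, with a(w) an attention value produced by a small sub-network. A minimal, illustrative sketch in plain Python (the sigmoid-of-magnitude attention here is a hypothetical stand-in, not the paper's learned LWE sub-block):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def excite(w):
    # Reparameterize a single weight as a(w) * w, where a(w) in (0, 1)
    # is an attention value derived from the weight itself.
    a = sigmoid(abs(w))  # stand-in attention: larger magnitude -> closer to 1
    return a * w

# A 3x3 kernel's weights are rescaled element-wise before convolution.
kernel = [[0.5, -1.0, 0.0],
          [2.0, 0.1, -0.3],
          [0.0, 1.5, -2.0]]
excited = [[excite(w) for w in row] for row in kernel]
```

Because the rescaling is multiplicative and differentiable, it folds into ordinary training with no extra inference cost once the effective weights are materialized.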
“…The proposed WEU-Net is an encoder-decoder model, shown in Figure 1, that is a modified version of the popular U-Net [5] model for the biomedical image segmentation task. The encoder part of the WEU-Net is composed of weight-excitation-based CNN (WE-CNN) [6], ReLU activation, and MaxPooling layers. The input size of the model is 256×256×1, since the input is a single-channel CT lung-nodule image.…”
Section: Proposed Model
confidence: 99%
“…Hence, it is necessary to develop advanced architectures that can address the weaknesses of the previous ones. In this paper, a weight excitation (WE) mechanism, which applies weight-reparameterization-based backpropagation corrections, is adopted from [6] and implemented with the U-Net encoder and decoder architecture to cope efficiently with the heterogeneity of lung nodule features, making it well suited to segmenting lung nodules of various forms. The main contributions of this research can be summarized as follows:…”
Section: Introduction
confidence: 99%
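The "backpropagation correction" mentioned above falls out of the chain rule: when the effective weight is a(w) · w, the gradient reaching w is rescaled weight-by-weight by d(a(w) · w)/dw. A toy derivative check, again using a hypothetical sigmoid-of-magnitude attention in place of the learned sub-network:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def excited_weight(w):
    # Effective weight a(w) * w used in the forward pass.
    return sigmoid(abs(w)) * w

def grad_correction(w):
    # d(a(w) * w)/dw: the per-weight factor that rescales the upstream
    # gradient during backprop (product rule on a(w) * w, chain rule
    # through |w|, with sigmoid'(x) = s(x) * (1 - s(x))).
    a = sigmoid(abs(w))
    da_dw = a * (1.0 - a) * (1.0 if w >= 0 else -1.0)
    return a + w * da_dw

# Finite-difference check that the analytic factor matches the forward pass.
w, eps = 0.7, 1e-6
numeric = (excited_weight(w + eps) - excited_weight(w - eps)) / (2 * eps)
```

In effect each weight gets its own gradient scale, which is what lets the mechanism emphasize some weights over others during training without changing the network's inference-time structure.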
“…After years of development, deep learning methods have achieved remarkable success on visual classification tasks [17,39,21,32]. The outstanding performance, however, heavily relies on large-scale labeled datasets [5].…”
Section: Introduction
confidence: 99%