2022
DOI: 10.1007/s11042-022-12219-1
ADM-Net: attentional-deconvolution module-based net for noise-coupled traffic sign recognition

Cited by 3 publications (3 citation statements); references 61 publications.
“…However, using this method for recognition in harsh environments requires relatively large memory. Chung proposed an attentional deconvolution module (ADM)-based network (ADM-Net) [ 33 ]. The network uses ADMs, convolutional pooling, and fully convolutional networks to improve classification under such harsh conditions.…”
Section: Related Work
confidence: 99%
“…For example, they are sensitive to environmental factors such as light, noise, and so on [ 2 ]. In addition, different kinds of traffic signs need different feature extractors [ 3 , 4 ]. Deep neural networks (DNNs) have achieved great success in the field of image recognition with the development of deep learning technology.…”
Section: Introduction
confidence: 99%
“…Currently, the receptive-field size tuned for large objects is not sufficient for traffic signs, because traffic signs are much smaller than other objects. Finally, we adopt ADM-Net [44] as the classifier in the detection algorithm. ADM-Net incorporates the attentional-deconvolution module (ADM) within the network to maximize feature information with the attention mechanism.…”
Section: Introduction
confidence: 99%
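The citation statements above describe the core idea of an attentional-deconvolution module: upsample a feature map with a deconvolution (transposed convolution) and weight the result with an attention map. The sketch below is only an illustrative approximation of that idea, not the paper's actual architecture; the kernel sizes, the scalar attention weight `w_attn`, and the function names are all hypothetical, and a real ADM would operate on multi-channel tensors with learned attention layers.

```python
import numpy as np

def transposed_conv2d(x, w, stride=2):
    """Naive single-channel 2-D transposed convolution (upsampling).
    x: (H, W) input feature map; w: (k, k) kernel."""
    H, W = x.shape
    k = w.shape[0]
    out = np.zeros((H * stride + k - stride, W * stride + k - stride))
    for i in range(H):
        for j in range(W):
            # Each input pixel scatters a scaled copy of the kernel.
            out[i * stride:i * stride + k, j * stride:j * stride + k] += x[i, j] * w
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attentional_deconvolution(x, w_deconv, w_attn, stride=2):
    """Hypothetical ADM sketch: deconvolve to upsample, then gate the
    upsampled map element-wise with a sigmoid attention map."""
    up = transposed_conv2d(x, w_deconv, stride)
    attn = sigmoid(w_attn * up)  # attention weights in (0, 1)
    return up * attn

# Toy usage: a 2x2 map upsampled to 4x4 and attention-gated.
x = np.arange(4.0).reshape(2, 2)
w = np.ones((2, 2)) / 4.0
y = attentional_deconvolution(x, w, w_attn=1.0)
print(y.shape)  # (4, 4)
```

The design point the citing papers emphasize is that the attention gate lets the upsampling path suppress noise-corrupted activations while preserving the small-object detail that deconvolution recovers.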