2022
DOI: 10.1117/1.jrs.16.046506
Context-aware cross-level attention fusion network for infrared small target detection

Abstract: Infrared small target detection (IRSTD) plays an essential role in many fields such as air guidance, tracking, and surveillance. However, because infrared small targets are tiny, easily confused with background noise, and lack clear contours and texture information, learning more discriminative small-target features while suppressing background noise remains a challenging task. In this paper, a context-aware cross-level attention fusion network for IRSTD is proposed. Specifically, a se…
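The abstract names a cross-level attention fusion design but the full text is not reproduced here, so the following is only a minimal, illustrative sketch of how attention-weighted fusion of a shallow (high-resolution) and a deep (semantic) feature map is commonly implemented in PyTorch. It is not the paper's actual module; the class name CrossLevelAttentionFusion, the reduction parameter, and the channel-attention layout are all assumptions for illustration.

```python
# Illustrative sketch only: a generic cross-level attention fusion block,
# assuming the common pattern of re-weighting features with channel attention
# before merging shallow and deep levels. Names are hypothetical and do NOT
# come from the cited paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossLevelAttentionFusion(nn.Module):
    def __init__(self, low_channels: int, high_channels: int,
                 out_channels: int, reduction: int = 4):
        super().__init__()
        # Project both feature levels to a common channel width.
        self.low_proj = nn.Conv2d(low_channels, out_channels, kernel_size=1)
        self.high_proj = nn.Conv2d(high_channels, out_channels, kernel_size=1)
        # Channel attention computed from the deep (semantic) features.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_channels, out_channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels // reduction, out_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # low: high-resolution shallow features; high: low-resolution deep features.
        low = self.low_proj(low)
        high = self.high_proj(high)
        high = F.interpolate(high, size=low.shape[-2:],
                             mode="bilinear", align_corners=False)
        weights = self.attn(high)        # (B, C, 1, 1) channel attention weights
        fused = low * weights + high     # attention-gated shallow features + deep context
        return self.fuse(fused)


if __name__ == "__main__":
    block = CrossLevelAttentionFusion(low_channels=64, high_channels=256, out_channels=64)
    low = torch.randn(1, 64, 128, 128)   # shallow, high-resolution feature map
    high = torch.randn(1, 256, 32, 32)   # deep, low-resolution feature map
    print(block(low, high).shape)        # torch.Size([1, 64, 128, 128])
```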

Cited by 5 publications (1 citation statement)
References: 49 publications
“…Deep learning methods for infrared detection can greatly improve the detection rate of the algorithm [18,19]. To further improve the detection rate and enhance the applicability of the detection algorithm, some scholars have proposed fusion network methods for infrared image detection [20][21][22]. Zhou et al [23] proposed an approach for improving object detection by designing an adaptive feature extraction module within the backbone network.…”
Section: Introduction
Confidence: 99%