2020
DOI: 10.1049/el.2020.1952

ABiFN: Attention‐based bi‐modal fusion network for object detection at night time

Abstract: Camera-based object detection in low-light/night-time conditions is a fundamental problem because of insufficient illumination. Existing approaches perform a mid-level fusion of RGB and thermal images so that the two modalities complement each other's features. In this work, an attention-based bi-modal fusion network is proposed for better object detection in the thermal domain by integrating a channel-wise attention module. Experimental results show that the proposed framework improves the mAP by 4.13 points on the FLIR dataset.
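The channel-wise attention fusion described in the abstract can be illustrated with a minimal PyTorch sketch. A squeeze-and-excitation-style gate and a concatenation-based fusion are assumed here; module names, the reduction ratio, and feature-map shapes are illustrative and not taken from the paper.

```python
# Minimal sketch of channel-wise attention fusion for RGB + thermal feature maps.
# A squeeze-and-excitation-style gate is assumed; the concatenation-based fusion
# and all names/shapes are illustrative, not the paper's exact design.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Re-weights feature channels using global-average-pooled statistics."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # squeeze: B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                  # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)                            # excite: scale each channel


class BiModalFusion(nn.Module):
    """Fuses RGB and thermal feature maps after channel-wise attention."""

    def __init__(self, channels: int):
        super().__init__()
        self.rgb_att = ChannelAttention(channels)
        self.thermal_att = ChannelAttention(channels)
        self.reduce = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, rgb_feat: torch.Tensor, thermal_feat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.rgb_att(rgb_feat), self.thermal_att(thermal_feat)], dim=1)
        return self.reduce(fused)                          # back to C channels for the detector head


if __name__ == "__main__":
    rgb = torch.randn(1, 256, 32, 32)      # mid-level RGB features (hypothetical shape)
    thermal = torch.randn(1, 256, 32, 32)  # mid-level thermal features
    print(BiModalFusion(256)(rgb, thermal).shape)  # torch.Size([1, 256, 32, 32])
```

The idea is that each modality's mid-level feature map is re-weighted channel by channel before fusion, so the detector can lean on whichever sensor carries more information in a given scene.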

Citation overview: cited by 7 publications (11 citation statements); references 23 publications.
“…Nevertheless, the Cyclic Fuse-and-Refine Network (CFR_3) could reduce the miss rate on the KAIST dataset and improve the mAP on the FLIR benchmark to 72.39%. In November 2020, A. Sai Charan et al. attempted to fuse IR and RGB data by weighting each sensor with an attention mechanism [12]. The attention module is placed in every residual block [13], after the first batch normalization and ReLU activation.…”
Section: Related Work
confidence: 99%
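The placement described in this citation statement can be sketched as follows, assuming a standard two-convolution residual block [13]; the attention gate, reduction ratio, and layer names are illustrative rather than taken from the original paper.

```python
# Sketch of where the channel-wise attention sits inside a residual block,
# following the citation statement: after the first batch normalization and
# ReLU. The two-convolution block layout follows standard ResNet [13]; the
# reduction ratio and other hyper-parameters are assumptions.
import torch
import torch.nn as nn


class ResidualBlockWithAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        self.att = nn.Sequential(                 # channel-wise attention gate
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))  # first conv -> BN -> ReLU
        out = out * self.att(out)                 # attention inserted at this point
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)                 # residual connection
```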
“…Zhen et al. design a novel flame detection method with a candidate target area extraction strategy, which achieves a higher recall rate on a self-built database [10]. An improved Faster R-CNN fire detection approach based on colour features and global image information is proposed in [13].…”
Section: Introduction
confidence: 99%
“…With the development of deep learning [2–6], frameworks applying deep convolutional neural networks (CNNs) can detect fires more accurately and efficiently [7–9, 10–13]. Muhammad et al.…”
Section: Introduction
confidence: 99%