2021
DOI: 10.1016/j.inffus.2021.02.023

RFN-Nest: An end-to-end residual fusion network for infrared and visible images

Cited by 555 publications (218 citation statements)
References 69 publications
“…To test the practicability and effectiveness of the proposed method, we set up two groups of experiments. The first group compares the proposed method with the RE-NSST and IFS-NSST methods, and the second compares it with nine other advanced fusion methods: FPDE [35] (fourth-order partial differential equations), VSM [36] (visual saliency map), Bala [37] (Bala fuzzy sets), Gauss [34] (Gauss fuzzy sets), DRTV [38] (different resolutions via a total variation model), LatLRR [39] (latent low-rank representation), SR [40] (sparse regularization), MDLatLRR [41] (decomposition based on latent low-rank representation), and RFN-Nest [42] (residual fusion network).…”
Section: Results
confidence: 99%
“…Our method is based on the PyTorch framework and runs on an NVIDIA GTX 1070 GPU. We evaluated the effectiveness and efficiency of our scheme by comparing it with a number of recent CNN-based methods (i.e., PMGI [50], FGAN [37], DDcGAN [36], FusionDN [46], U2Fusion [45], RFN [19], GANMcC [38], Nest [16], DID [51], and Dense [15]).…”
Section: Training Details
confidence: 99%
“…Thus, in addition to fusion quality metrics, we added a perceptual quality metric to show the performance of our method more accurately. Furthermore, we conducted an experiment on the TNO [13] dataset using the state-of-the-art deep learning methods DLF [2], DeepFuse [3], Dual-Branch [6], and RFN-Nest [5], as well as the Hybrid MSD [16] method. Table 1 gives the average scores of the six methods, including the proposed method, and shows that our method obtains the best results.…”
Section: Quantitative Performance Comparisons
confidence: 99%
“…This method is used to solve semantic segmentation problems on multi-spectral image pairs. The latest approaches, RFN-Nest [5] and Dual-Branch [6], also exploit the power of the encoder-decoder network: they reconstruct the fused image directly from the source image pairs.…”
Section: Introduction
confidence: 99%