2023
DOI: 10.3390/rs15194729
Multi-Scale Feature Fusion Based on PVTv2 for Deep Hash Remote Sensing Image Retrieval

Famao Ye, Kunlin Wu, Rengao Zhang et al.

Abstract: For high-resolution remote sensing image retrieval tasks, single-scale features cannot fully express the complexity of the image information. Due to the large volume of remote sensing images, retrieval requires extensive memory and time. Hence, the problem of how to organically fuse multi-scale features and enhance retrieval efficiency is yet to be resolved. We propose an end-to-end deep hash remote sensing image retrieval model (PVTA_MSF) by fusing multi-scale features based on the Pyramid Vision Transformer …
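The retrieval pipeline the abstract describes can be illustrated with a minimal sketch: per-scale feature vectors are fused by concatenation, projected, and binarized into short hash codes, so retrieval reduces to cheap Hamming-distance comparisons. This is an assumption-laden toy, not the paper's PVTA_MSF model: the projection `W` is random here (learned in the real model), the feature dimensions are arbitrary, and `fuse_and_hash` is a hypothetical helper name.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_and_hash(features, W):
    """Concatenate multi-scale feature vectors, project, and binarize.

    `features` is a list of per-scale feature vectors (as pooled outputs
    of a multi-stage backbone might be); `W` stands in for a learned
    hash projection and is random here for illustration only.
    """
    fused = np.concatenate(features)           # multi-scale fusion by concatenation
    return np.sign(W @ fused).astype(np.int8)  # +/-1 hash code

def hamming(a, b):
    # Hamming distance between two +/-1 hash codes
    return int(np.sum(a != b))

# Toy "database": five images, each with three scales of features
dims = [64, 128, 256]                          # arbitrary per-scale dimensions
W = rng.standard_normal((48, sum(dims)))       # 48-bit hash code
db = [[rng.standard_normal(d) for d in dims] for _ in range(5)]

codes = [fuse_and_hash(f, W) for f in db]
qcode = fuse_and_hash(db[2], W)                # query identical to entry 2
dists = [hamming(qcode, c) for c in codes]
print(int(np.argmin(dists)))                   # entry 2: distance 0 to itself
```

The efficiency claim in the abstract follows from the code length: comparing 48-bit codes is far cheaper in memory and time than comparing full floating-point descriptors.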



Cited by 4 publications (1 citation statement)
References 56 publications
“…Considering the potential cumulative error caused via the separate estimation of atmospheric light and the transmission map, IFE-Net unifies atmospheric light and transmission maps as one parameter to directly obtain a clean image. In addition, attention mechanisms have been widely applied in the design of neural networks [19,33–36], which can provide additional flexibility in the network. Inspired by these works and considering the different weights of features in different regions, a feature attention mechanism module called attention mechanism (AM) is designed in the network, which processes different types of information more effectively.…”
Section: Introduction
Confidence: 99%
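The re-weighting idea the statement appeals to (different regions of a feature map carrying different importance) can be sketched as a simple spatial attention step: score each spatial position, softmax the scores into weights, and re-weight the features. This is a generic illustration under stated assumptions, not the cited IFE-Net module; the scoring vector `w` is random here where a real module would learn it.

```python
import numpy as np

def spatial_attention(feat, w):
    """Minimal spatial (region-wise) attention sketch.

    feat: (C, N) feature map with C channels flattened over N spatial
    positions; w: (C,) scoring vector (learned in practice, random here).
    Positions judged more informative receive larger weights.
    """
    scores = w @ feat                             # one score per position
    scores = scores - scores.max()                # numerical stability
    attn = np.exp(scores) / np.exp(scores).sum()  # softmax over positions
    return feat * attn                            # re-weight each position

rng = np.random.default_rng(1)
feat = rng.standard_normal((8, 16))  # 8 channels, 4x4 spatial positions
w = rng.standard_normal(8)
out = spatial_attention(feat, w)
print(out.shape)  # same shape as the input feature map
```

The softmax guarantees the position weights are positive and sum to one, so the module only redistributes emphasis across regions rather than changing the overall feature scale arbitrarily.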