IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium
DOI: 10.1109/igarss46834.2022.9884280
Feature Pyramid and Global Attention Network Approach to InSAR Two-Dimensional Phase Unwrapping

Cited by 2 publications (3 citation statements) | References 14 publications
“…The feature maps obtained by adding attention and the global average pooling branch are added to form the ultimate output, further improving the performance of the FPA module. The GAU [19] module introduces a mechanism to assign weights to another feature based on the input feature, enabling the network to focus on crucial information. The network structure of the GAU module, depicted in Figure 2, involves several key steps.…”
Section: Proposed Network Model Structures (mentioning)
confidence: 99%
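The gating mechanism described in the statement above, where global context from one feature assigns channel weights to another, can be sketched roughly as follows. This is a minimal NumPy sketch of a GAU-style block; the function name, tensor shapes, and the sigmoid stand-in for a 1x1 convolution are illustrative assumptions, not the cited authors' implementation:

```python
import numpy as np

def global_attention_upsample(low_feat, high_feat):
    """Sketch of a GAU-style block: the high-level feature is globally
    average-pooled to per-channel weights that gate the low-level feature,
    and the result is added to the (upsampled) high-level feature.

    low_feat:  (C, H, W) fine-resolution feature map
    high_feat: (C, h, w) coarse-resolution feature map, with H = 2h, W = 2w
    """
    # Global average pooling of the high-level feature -> (C, 1, 1) weights
    weights = high_feat.mean(axis=(1, 2), keepdims=True)
    # Squash weights into (0, 1); stands in for a 1x1 conv + sigmoid
    weights = 1.0 / (1.0 + np.exp(-weights))
    # Gate the low-level feature channel-wise
    attended = low_feat * weights
    # Nearest-neighbour upsample of the high-level feature to match
    upsampled = high_feat.repeat(2, axis=1).repeat(2, axis=2)
    # Fuse by addition
    return attended + upsampled
```

The addition at the end mirrors the statement's point that the attended branch and the context branch are summed to form the output.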
“…The residual connections of ResNet help alleviate the gradient vanishing issue, speeding up the training process and enabling the network to learn complex features more effectively with increased depth. Moreover, the incorporation of GAU and FPA helps mitigate the feature information loss in module connections [19][20][21][22]. GAU guides the network to focus on global information, while FPA assists the network with capturing features at different scales, enhancing the robustness of phase unwrapping.…”
Section: Introduction (mentioning)
confidence: 99%
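The residual shortcut mentioned above, which lets gradients bypass a block and eases training of deep networks, reduces to a simple addition. Below is a generic NumPy sketch of the ResNet idea, not code from the cited paper:

```python
import numpy as np

def residual_block(x, transform):
    """Sketch of a ResNet-style residual connection: the block learns a
    residual F(x) and the shortcut adds the input back, so identity
    mappings (and their gradients) pass through the block unchanged."""
    return x + transform(x)

# With a zero residual, the block is exactly the identity mapping,
# which is what keeps very deep stacks trainable.
identity_out = residual_block(np.arange(4.0), lambda x: np.zeros_like(x))
```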
“…The network has fewer layers, high accuracy, and fast execution speed. Chen Xiaomao proposed a lightweight network that combines two attention mechanisms [20]. This method is characterized by its time efficiency.…”
Section: Introduction (mentioning)
confidence: 99%