2020
DOI: 10.1109/tgrs.2020.2977819
Hyperspectral Unmixing Using Orthogonal Sparse Prior-Based Autoencoder With Hyper-Laplacian Loss and Data-Driven Outlier Detection

Cited by 27 publications (17 citation statements) | References 35 publications
“…We will come back to this question when discussing the experimental results. The methods in [41, 73, 76–80] employ deep encoders. The method in [81] employs a deep 1D CNN encoder.…”
Section: A Deep Versus Shallow Encoder
confidence: 99%
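
As a purely illustrative reading of the deep 1D CNN encoder idea mentioned above, the sketch below shows one way such an encoder could map a per-pixel spectrum to abundance fractions, assuming PyTorch; the layer widths, kernel sizes, and the softmax abundance head are assumptions, not the architecture of [81].

    # Illustrative sketch only (assumed PyTorch; sizes are hypothetical, not from [81]).
    import torch
    import torch.nn as nn

    class CNNEncoder1D(nn.Module):
        """Deep 1D CNN encoder: pixel spectrum (num_bands) -> abundance fractions."""
        def __init__(self, num_bands: int, num_endmembers: int):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.LeakyReLU(0.1),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.LeakyReLU(0.1),
                nn.AdaptiveAvgPool1d(1),       # collapse the spectral axis
            )
            self.head = nn.Linear(32, num_endmembers)

        def forward(self, x):                  # x: (batch, num_bands)
            z = self.features(x.unsqueeze(1))  # (batch, 32, 1)
            a = self.head(z.squeeze(-1))       # (batch, num_endmembers)
            return torch.softmax(a, dim=1)     # nonnegative, sum-to-one abundances

    # Usage: CNNEncoder1D(num_bands=224, num_endmembers=4)(torch.rand(8, 224))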
“…In unmixing, this can cause the abundance maps of certain endmembers to become zero-valued. The works [71, 79, 80, 83, 84] all use the ReLU activation. The LReLU activation is a good choice of activation function, as it is non-saturating in both directions.…”
Section: B Choice Of Activation Function For Hidden Layers
confidence: 99%
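
The dying-ReLU argument above can be checked directly: for negative inputs ReLU has zero gradient, so a unit (and hence an abundance map) can get stuck at zero, while LReLU keeps a small nonzero slope in both directions. A minimal sketch in PyTorch, with toy values not taken from the cited works:

    # Toy check of ReLU vs. LReLU gradients (assumed PyTorch; values are illustrative).
    import torch
    import torch.nn.functional as F

    x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)

    F.relu(x).sum().backward()
    print(x.grad)                  # tensor([0., 0., 1., 1.]): zero gradient for x < 0

    x.grad = None
    F.leaky_relu(x, negative_slope=0.1).sum().backward()
    print(x.grad)                  # tensor([0.1, 0.1, 1., 1.]): nonzero slope everywhere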
“…et al. [24] further introduced the Wasserstein distance as a regularization term to better account for the distributional similarity between the observation and the reconstruction. Based on the observation that the abundance maps of different endmembers are nearly orthogonal, Dou et al. embedded an orthogonal sparse prior into the AE to better model the relationships among abundance maps [25]. Qu et al. introduced an untied denoising AE (uDAS), which decouples the decoder from the encoder weights for more accurate endmember extraction [26].…”
Section: Introduction
confidence: 99%
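
As a hedged illustration of the near-orthogonality idea attributed to Dou et al. [25] above, one common way to encode it as a loss term is to penalize the off-diagonal entries of the Gram matrix of the flattened abundance maps. The sketch below (PyTorch) is one plausible formulation under that assumption, not the exact regularizer of [25].

    # One plausible "orthogonal sparse prior" loss (assumption, not the exact term of [25]).
    import torch

    def orthogonal_sparse_penalty(abundances: torch.Tensor) -> torch.Tensor:
        # abundances: (R, H*W), one flattened abundance map per endmember
        gram = abundances @ abundances.t()           # (R, R) pairwise map overlaps
        off_diag = gram - torch.diag(torch.diag(gram))
        return off_diag.abs().sum()                  # zero iff the maps are orthogonal

    # total_loss = reconstruction_loss + lam * orthogonal_sparse_penalty(A)
    # where A stacks the R abundance maps and lam is a hypothetical tuning weight.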