2022
DOI: 10.1109/tgrs.2022.3196057

Hyperspectral Unmixing Using Transformer Network

Abstract: Transformers have intrigued the vision research community with their state-of-the-art performance in natural language processing. With their superior performance, transformers have found their way into the field of hyperspectral image classification and achieved promising results. In this article, we harness the power of transformers to conquer the task of hyperspectral unmixing and propose a novel deep neural network-based unmixing model with transformers. A transformer network captures nonlocal feature dependencies…

Cited by 56 publications (29 citation statements) · References 62 publications

Citation statements
“…We evaluate the proposed DSET-Net compared with several classical unmixing algorithms, such as the fully constrained least squares (FCLS) [16], the untied denoising autoencoder with sparsity (uDAS) [17], the deep autoencoder network (DAEN) [18], the cycle-consistency unmixing network (CyCU-Net) [19], the hyperspectral unmixing using deep image prior (UnDIP) [20], and the DeepTrans-HSU [10]. The quantitative results are shown in Table 1.…”
Section: Experimental Results and Analysis
Mentioning confidence: 99%
“…As a neural network based mainly on the self-attention mechanism, the transformer can better explore the potential relationships among different features, which improves the ability of the autoencoder to aggregate spatial-spectral correlations (SSCs) from pixels, thus improving abundance learning and the accuracy of unmixing [9]. However, the original transformer only uses block sequences to obtain the spectral information of pixels, ignoring the spatial correlation between pixels inside the blocks, which negatively affects unmixing [10].…”
Section: Introduction
Mentioning confidence: 99%
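The block-token pipeline this quote describes can be made concrete with a short sketch. Below is a minimal PyTorch illustration (all layer sizes, band counts, and block sizes are illustrative assumptions, not the authors' architecture): non-overlapping blocks of a hyperspectral patch are embedded as tokens, and a transformer encoder lets every token attend to every other token. Note that spatial detail inside each block is collapsed at the embedding step, which is exactly the limitation the quote points out.

```python
# Minimal sketch (assumed shapes): block tokens + transformer encoder
import torch
import torch.nn as nn

B, L, P = 4, 198, 64          # batch, spectral bands, patch size (assumed values)
patch_embed = nn.Conv2d(L, 128, kernel_size=8, stride=8)  # non-overlapping 8x8 blocks -> tokens
encoder_layer = nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

x = torch.randn(B, L, P, P)                         # hyperspectral patch, bands as channels
tokens = patch_embed(x).flatten(2).transpose(1, 2)  # (B, 64 tokens, 128): one token per block
ctx = encoder(tokens)                               # self-attention aggregates nonlocal
                                                    # correlations across all block tokens
```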
“…First, the MSA allows the free flow of information between tokens [45], facilitating the exchange of information to focus on the interrelationships between plant, leaf disease, and severity features [49]. It enhances the ability of the multi-label identification model to identify plant disease characteristics.…”
Section: Token Encoder Module
Mentioning confidence: 99%
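The "free flow of information between tokens" that this quote attributes to MSA is the all-to-all attention pattern. A minimal PyTorch sketch, with shapes and sizes as illustrative assumptions: each output token is a learned weighted mixture of all input tokens, and the returned weight matrix makes the token-to-token exchange explicit.

```python
# Minimal sketch of MSA's token-to-token information flow (assumed shapes)
import torch
import torch.nn as nn

tokens = torch.randn(2, 16, 128)   # (batch, tokens, dim), illustrative sizes
msa = nn.MultiheadAttention(embed_dim=128, num_heads=8, batch_first=True)

out, attn = msa(tokens, tokens, tokens)  # each output token mixes all input tokens
print(attn.shape)                        # (2, 16, 16): attention weights between tokens
```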
“…This is generally done by utilizing the encoder, while the decoder transforms the estimated fractional abundances into reconstructed hyperspectral images using linear layers, where the endmembers act as the weights. A variety of autoencoders have been applied to hyperspectral unmixing: denoising autoencoders [4], sparse nonnegative autoencoders [5], variational autoencoders [6], [7], convolutional autoencoders [8], [9], [10], [11], adversarial autoencoders [12], [13], and transformer autoencoders [14], [15].…”
Section: Introduction
Mentioning confidence: 99%
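The encoder/decoder split described in this quote is compact enough to sketch. A minimal PyTorch illustration, under assumed band and endmember counts: the encoder maps a pixel spectrum to fractional abundances, and a single bias-free linear decoder reconstructs the spectrum, so the decoder's weight matrix plays the role of the endmember matrix.

```python
# Minimal sketch of an unmixing autoencoder (assumed sizes, not a specific paper's model)
import torch
import torch.nn as nn

bands, R = 198, 5                          # spectral bands, number of endmembers (assumed)

encoder = nn.Sequential(
    nn.Linear(bands, 64), nn.ReLU(),
    nn.Linear(64, R),
    nn.Softmax(dim=-1),                    # abundances: nonnegative, sum to one
)
decoder = nn.Linear(R, bands, bias=False)  # decoder.weight (bands x R) acts as endmembers

pixel = torch.randn(8, bands)
abundances = encoder(pixel)                # (8, R) fractional abundances
reconstruction = decoder(abundances)       # linear mixing model: x ≈ E a
endmembers = decoder.weight.detach()       # (bands, R) estimated endmember spectra
```

Here the softmax enforces the nonnegativity and sum-to-one abundance constraints; the autoencoder variants cited above differ mainly in the encoder architecture and the regularization applied to this basic scheme.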