2020
DOI: 10.3390/rs12071179
Unsupervised and Supervised Feature Extraction Methods for Hyperspectral Images Based on Mixtures of Factor Analyzers

Abstract: This paper proposes three feature extraction (FE) methods based on density estimation for hyperspectral images (HSIs). The methods are a mixture of factor analyzers (MFA), deep MFA (DMFA), and supervised MFA (SMFA). The MFA extends the Gaussian mixture model to allow a low-dimensionality representation of the Gaussians. DMFA is a deep version of MFA and consists of a two-layer MFA, i.e., samples from the posterior distribution at the first layer are input to an MFA model at the second layer. SMFA consists of si…
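The building block the abstract describes — a Gaussian with a low-dimensional factor representation — can be illustrated with a minimal sketch. This is not the paper's implementation; it is a single factor analyzer (the component MFA mixes over) fitted by EM in NumPy, where the data covariance is modelled as W Wᵀ + diag(ψ) and the low-dimensional features are the posterior means of the latent factors. All function and variable names are illustrative.

```python
import numpy as np

def fit_fa(X, k, n_iter=200, seed=0):
    """EM for a single factor analyzer: x ~ N(mu, W W^T + diag(psi)).

    Returns the fitted parameters and a transform that maps data to the
    k-dimensional posterior-mean features E[z | x].
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    W = 0.1 * rng.standard_normal((d, k))   # factor loadings
    psi = Xc.var(axis=0) + 1e-6             # diagonal noise variances
    for _ in range(n_iter):
        # E-step: posterior of latent factors z given x
        Pi = 1.0 / psi
        G = np.linalg.inv(np.eye(k) + (W.T * Pi) @ W)   # posterior covariance
        Ez = Xc @ (Pi[:, None] * W) @ G                  # (n, k) posterior means
        Ezz = n * G + Ez.T @ Ez                          # sum_i E[z_i z_i^T]
        # M-step: closed-form updates for W and psi
        XtEz = Xc.T @ Ez
        W = XtEz @ np.linalg.inv(Ezz)
        psi = np.maximum((Xc * Xc).sum(0) / n - (W * XtEz).sum(1) / n, 1e-6)

    def transform(Xnew):
        Pi = 1.0 / psi
        G = np.linalg.inv(np.eye(k) + (W.T * Pi) @ W)
        return (Xnew - mu) @ (Pi[:, None] * W) @ G

    return mu, W, psi, transform

# Synthetic "pixels": 50 bands generated from 3 latent factors plus noise.
rng = np.random.default_rng(1)
Z = rng.standard_normal((500, 3))
A = rng.standard_normal((50, 3))
X = Z @ A.T + 0.1 * rng.standard_normal((500, 50))

mu, W, psi, transform = fit_fa(X, k=3)
features = transform(X)   # (500, 3) low-dimensional features
```

An MFA would maintain several such analyzers plus mixing weights, with responsibilities computed in the E-step; the per-component updates are the same as above, weighted by the responsibilities.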

Cited by 15 publications (5 citation statements) · References 54 publications
“…(3) graph-based methods [21][22][23]; e.g., Ding et al [23] proposed a semi-supervised locality-preserving dense graph neural network (GNN) for HSI classification in which autoregressive moving average filters and context-aware learning are integrated. Moreover, the self-supervised learning methods are also applied to the few-shot HSI classification [24][25][26][27]. In [26], a self-supervised contrastive fruitful disymmetrical expanded network is presented for HSI classification.…”
Section: Introduction
confidence: 99%
“…Some previous classification studies (e.g. 24 ) have cleaned the Indian Pines data by limiting the classes to the largest ones, since some classes have only a limited number of samples (see Table 2). It would have been interesting to repeat this experiment with the Indian Pines data and limited classes.…”
Section: Discussion
confidence: 99%
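The class-limiting step mentioned above can be sketched as a simple label filter: keep only the ground-truth classes with enough labelled pixels. This is a hypothetical illustration, not code from the cited study; the function name and the threshold are illustrative, and class 0 is assumed to be the unlabelled background, as is conventional for Indian Pines.

```python
import numpy as np

def limit_classes(labels, min_samples=400):
    """Boolean mask selecting pixels whose class has >= min_samples labels.

    Class 0 (unlabelled background) is always excluded.
    """
    classes, counts = np.unique(labels, return_counts=True)
    keep = [c for c, n in zip(classes, counts) if c != 0 and n >= min_samples]
    return np.isin(labels, keep)

# Toy ground truth: class 3 has a single sample and is dropped.
labels = np.array([0, 1, 1, 1, 2, 2, 3])
mask = limit_classes(labels, min_samples=2)
```

Applying `mask` to both the pixel features and the labels restricts training and evaluation to the retained classes.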
“…With those methods, we can reduce the number of training samples (R size) and the computing time without decreasing the accuracy rate. FE is an important step before applying HSI classification methods 10,26 .…”
Section: Discussion
confidence: 99%
“…Even on the 0th day, the negative values of LDA component 1 appeared to originate from the fat tissues, whose fluorescence was similar to that of NADH. The decision or evaluation boundaries of F. I. produced by QDA are shown in Fig. 5b as contour plots 22 . Information on F. I., averaged over each hyperspectral image by line-scan-type HIS as reference data, and hyperspectral images taken by snapshot-type HIS were obtained for an identical meat specimen as a function of storage time.…”
confidence: 99%