2019
DOI: 10.3390/rs11161954

A Novel Hyperspectral Image Classification Pattern Using Random Patches Convolution and Local Covariance

Abstract: Today, more and more deep learning frameworks are being applied to hyperspectral image classification tasks and have achieved strong results. However, such approaches are still hampered by long training times. Traditional spectral–spatial hyperspectral image classification uses spectral features only at the pixel level, without considering the correlation between local spectral signatures. This article tests a novel hyperspectral image classification pattern that uses random-patches convolution and local covariance.
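The core idea named in the title, convolving the image with kernels that are simply random patches cropped from the data itself, can be sketched in a few lines. The following is a minimal Python illustration under assumed shapes and parameters (patch size, patch count, 2-D kernels taken from a single band), not the authors' exact RPNet/RPCC pipeline:

    import numpy as np
    from scipy.signal import convolve2d

    def random_patches_convolution(img, num_patches=8, k=7, seed=0):
        # img: (H, W, B) hyperspectral cube. Kernels are random k x k
        # patches cropped from the first band; each kernel is convolved
        # with every band and passed through a ReLU, with no training.
        rng = np.random.default_rng(seed)
        h, w, b = img.shape
        feats = []
        for _ in range(num_patches):
            r = int(rng.integers(0, h - k + 1))
            c = int(rng.integers(0, w - k + 1))
            kernel = img[r:r + k, c:c + k, 0]          # untrained random-patch kernel
            for band in range(b):
                out = convolve2d(img[:, :, band], kernel, mode="same")
                feats.append(np.maximum(out, 0))       # ReLU
        return np.stack(feats, axis=-1)                # (H, W, num_patches * B)

    features = random_patches_convolution(np.random.rand(64, 64, 3))
    print(features.shape)                              # (64, 64, 24)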


Cited by 8 publications (5 citation statements). References 54 publications.
“…While one parameter is adjusted to its optimum value within a specified range, the remaining parameters are kept constant. To validate the effectiveness of the GRR, we select several recent methods, namely JCRC-MTL [34], RPNet [38], RPCC [39], SSRPNet [41], GRPC [43], RPNet-RF [45], and MS-RPNet [46], for comparison. In addition, an SVM-based variant of GRR (GR-SVM), which classifies the features with an SVM instead of RCR, is also included to demonstrate the effectiveness of the GRR.…”
Section: Experimental Analysis and Results
confidence: 99%
“…Over the years, different RPNet-based HSIC methods have been proposed. In random patches convolution and local covariance (RPCC) [39], the dimensionality of the HSI was first reduced by the maximum noise fraction (MNF) [40]. RPNet was then used to extract spatial features, and local covariance matrices were utilized to extract spectral features.…”
Section: Introduction
confidence: 99%
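As a rough illustration of the local-covariance step described in that excerpt, the sketch below computes a covariance matrix over the spectral vectors inside a small spatial window around a pixel; the window size and the random stand-in data are assumptions, not the paper's exact procedure:

    import numpy as np

    def local_covariance(hsi, row, col, win=5):
        # hsi: (H, W, B) cube, e.g. after MNF dimensionality reduction.
        # Returns the (B, B) covariance of the spectral vectors inside
        # the win x win spatial window centered on (row, col).
        h, w, b = hsi.shape
        r0, r1 = max(row - win // 2, 0), min(row + win // 2 + 1, h)
        c0, c1 = max(col - win // 2, 0), min(col + win // 2 + 1, w)
        samples = hsi[r0:r1, c0:c1, :].reshape(-1, b)   # (N, B) spectra
        centered = samples - samples.mean(axis=0, keepdims=True)
        return centered.T @ centered / max(len(samples) - 1, 1)

    cube = np.random.rand(64, 64, 10)   # stand-in for an MNF-reduced cube
    print(local_covariance(cube, 32, 32).shape)   # (10, 10)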
“…MNF was applied for dimensionality reduction in this study. MNF segregates noise from the bands through a modified principal-component analysis (PCA) that ranks images by signal-to-noise ratio (SNR) [39,49]. MNF defines the noise of each band as follows:…”
Section: Frequency Transformation - Minimum Noise Fraction (MNF)
confidence: 99%
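The excerpt is cut off before the formula. As a hedged reconstruction from the standard MNF literature (the citing paper's exact notation is not shown here), each band is modeled as signal plus additive noise, and components are ranked by noise fraction:

    \[
    X = S + N, \qquad
    \mathrm{NF}(\mathbf{w}) = \frac{\mathbf{w}^{\top} \Sigma_N \mathbf{w}}{\mathbf{w}^{\top} \Sigma \mathbf{w}},
    \]

where $\Sigma$ is the covariance of the data and $\Sigma_N$ the covariance of the noise, the latter commonly estimated from differences between neighboring pixels. Minimizing the noise fraction (equivalently, maximizing SNR) leads to the generalized eigenproblem $\Sigma_N \mathbf{w} = \lambda \Sigma \mathbf{w}$, whose eigenvectors give the MNF components in order of decreasing SNR.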
“…Many state-of-the-art methods have been developed to address the environmental-noise and dimensionality problems of HSIs, such as statistical filters [27], feature-extraction algorithms [28–31], discrete Fourier transforms and wavelet estimation [32,33], rotation forests [34], support vector machines (SVMs) [37], morphological segmentation [35,36], minimum noise fractions (MNFs) [38,39], and empirical mode decomposition (EMD). EMD is a one-dimensional signal-decomposition method that decomposes an input signal into several hierarchical components known as intrinsic mode functions (IMFs) plus a residue signal [40–43].…”
Section: Introduction
confidence: 99%
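As a quick illustration of the EMD decomposition described in that excerpt, the sketch below uses the third-party PyEMD package (an assumed choice of implementation; the citing paper does not name one) to split a 1-D signal into IMFs plus a residue:

    import numpy as np
    from PyEMD import EMD  # pip install EMD-signal

    # Synthetic 1-D signal: two tones plus a slow trend.
    t = np.linspace(0, 1, 1000)
    signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 25 * t) + t

    emd = EMD()
    imfs = emd(signal)                    # rows are IMFs, finest oscillations first
    leftover = signal - imfs.sum(axis=0)  # whatever the IMFs did not capture
    print(imfs.shape)                     # (n_imfs, 1000)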
“…Unlike in [38], the random patch classification method of [39] used a random strategy to obtain convolution kernels directly from the original HSI without training, which greatly reduces running time. To exploit the correlation between local spectral features, Sun et al. [40] proposed a random patch and local covariance classification framework that combined the covariance matrix with RPNet, building on [39] and thereby greatly improving classification accuracy.…”
Section: Introduction
confidence: 99%