2018
DOI: 10.1109/jstars.2018.2872969

KNN-Based Representation of Superpixels for Hyperspectral Image Classification

Cited by 77 publications (46 citation statements)
References 70 publications
“…In Figures 7 and 8, the NPV and MCC values of the proposed random forest classifier are near 1 and show higher performance compared with ANN [29], KNN [30], and decision tree [32]. For FPR, FNR, FRR, and error rate, the proposed random forest classifier is likewise compared against these methods. In Figure 9(a), performance measures such as specificity, sensitivity, and accuracy are compared across the random forest classifier, KNN, ANN, and decision tree.…”
Section: Performance Analysis
confidence: 99%
“…Sellars et al. used a combination of the Gaussian kernel technique, the Log-Euclidean distance of a covariance matrix, and the Euclidean spectral distance to construct a weight between two connected superpixels [41]. Using domain transform recursive filtering and the k-nearest-neighbor (KNN) rule, Tu et al. [47] gave a representation of the distance between superpixels. These works are effective attempts at HSI classification at the superpixel level.…”
Section: Superpixel-to-superpixel Similarity
confidence: 99%
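The superpixel-to-superpixel distances cited above can be illustrated with a minimal sketch. The function below aggregates the k smallest pairwise Euclidean spectral distances between two superpixels; it is a simplified stand-in only (it omits, e.g., the domain transform recursive filtering of [47]), and the function name and parameters are hypothetical:

```python
import numpy as np

def knn_superpixel_distance(sp_a, sp_b, k=3):
    """Illustrative superpixel-to-superpixel distance: the mean of the
    k smallest pairwise Euclidean spectral distances between the pixels
    of two superpixels (a simplification of the KNN rule in [47])."""
    # sp_a: (n_a, bands) and sp_b: (n_b, bands) arrays of spectral vectors
    diff = sp_a[:, None, :] - sp_b[None, :, :]          # all pairwise differences
    d = np.sqrt((diff ** 2).sum(axis=2)).ravel()        # flattened pairwise distances
    k = min(k, d.size)                                  # guard against tiny superpixels
    return float(np.sort(d)[:k].mean())                 # average of the k nearest pairs
```

A smaller distance indicates spectrally similar superpixels, which is then usable as an edge weight between connected superpixels.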
“…Compared with the similarity between a pair of superpixels introduced in [40] (an affine hull model and the singular value decomposition), the similarity suggested above is easy to understand, since only a sorting rule is used. In contrast to the similarity defined in [41,47], our method is simple to calculate and uses no parameters. These advantages make the proposal easier to apply in the field of remote sensing.…”
Section: Superpixel-to-superpixel Similarity
confidence: 99%
“…One is based on spectral information alone, while the other jointly uses spectral and spatial information. Traditional machine learning classification algorithms for hyperspectral images, such as k-nearest neighbor [7], spectral angle mapping [8], and multinomial logistic regression [9], basically belong to the former category. To improve classification performance, later research used support vector machines [10], principal component analysis (PCA) [11], and independent component analysis [12] to reduce redundant informative features.…”
Section: Introduction
confidence: 99%
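As a rough illustration of the spectral-only KNN classification mentioned in the statement above, the sketch below assigns each test spectrum the majority label among its k nearest training spectra under Euclidean distance. This is a generic textbook KNN, not the method of any cited work, and the function name is illustrative:

```python
import numpy as np

def knn_classify(train_X, train_y, test_X, k=3):
    """Classify each row of test_X by majority vote among its k nearest
    neighbors in train_X (Euclidean distance in spectral space)."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)         # distance to every training spectrum
        nearest = train_y[np.argsort(d)[:k]]            # labels of the k closest spectra
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])           # majority vote
    return np.array(preds)
```

Because each pixel is treated independently, this purely spectral approach ignores spatial context, which is exactly what the spectral-spatial (e.g., superpixel-based) methods discussed above aim to add.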