2016
DOI: 10.3390/rs8070601
Kernel Supervised Ensemble Classifier for the Classification of Hyperspectral Data Using Few Labeled Samples

Abstract: Kernel-based methods and ensemble learning are two important paradigms for the classification of hyperspectral remote sensing images. However, they were developed in parallel with different principles. In this paper, we aim to combine the advantages of kernel and ensemble methods by proposing a kernel supervised ensemble classification method. In particular, the proposed method, namely RoF-KOPLS, combines the merits of ensemble feature learning (i.e., Rotation Forest (RoF)) and kernel supervised learning (i.e., …)
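To make the combination concrete, here is a minimal sketch of the Rotation-Forest side of the idea paired with a kernel base learner. An RBF-kernel SVM stands in for the paper's KOPLS projection, and all function names, member counts, and parameters below are illustrative assumptions, not the authors' settings (the full RoF also bootstraps samples per subset, which this sketch omits).

```python
# Rotation-Forest-style ensemble with a kernel base learner (sketch).
# An RBF SVM stands in for KOPLS; parameters are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.datasets import make_classification

def fit_rotation_kernel_ensemble(X, y, n_members=10, n_subsets=4, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    members = []
    for _ in range(n_members):
        # Split the features into random disjoint subsets and fit a PCA
        # on each; the per-subset loadings form a block-diagonal rotation.
        perm = rng.permutation(n_features)
        rotation = np.zeros((n_features, n_features))
        for idx in np.array_split(perm, n_subsets):
            pca = PCA().fit(X[:, idx])
            rotation[np.ix_(idx, idx)] = pca.components_.T
        clf = SVC(kernel="rbf", probability=True).fit(X @ rotation, y)
        members.append((rotation, clf))
    return members

def predict(members, X):
    # Average class probabilities over the rotated-feature members.
    probs = np.mean([clf.predict_proba(X @ R) for R, clf in members], axis=0)
    return probs.argmax(axis=1)

X, y = make_classification(n_samples=200, n_features=16, random_state=0)
members = fit_rotation_kernel_ensemble(X, y)
acc = (predict(members, X) == y).mean()
```

Each member sees all features, but rotated differently, which is what gives Rotation Forest its diversity while keeping every base learner's input full-rank.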

Cited by 16 publications (6 citation statements)
References 50 publications
“…Specifically, the assumptions of stationarity and ergodicity of the system, both in terms of data statistics (e.g., [1], [16], [17], and [18] chap. 3) and processing approach (e.g., [2], [19]-[21], [18] chap. 16), are implied and/or drawn in order to maximize the trade-off between precision and efficiency of the remote sensing data analysis frameworks.…”
Section: A System Model
confidence: 99%
“…Moreover, the combination of an arbitrary number of classifiers performed at each iteration makes AdaBoost typically robust to overfitting (even after a large number of iterations), as well as capable of reducing the generalization error. For these reasons, AdaBoost has been successfully employed in several research fields, from face recognition to biomedical imaging, from natural language processing to remote sensing data analysis [20], [23], [27].…”
Section: Main Approaches in Model-Level Ensemble Learning
confidence: 99%
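The excerpt above can be illustrated with a short AdaBoost run; the dataset, split, and iteration count here are arbitrary stand-ins, not values from the cited study. scikit-learn's default weak learner is already the classic depth-1 decision stump, so no base estimator needs to be specified.

```python
# Hypothetical AdaBoost illustration: many reweighted weak learners
# combined into one classifier. All parameters are arbitrary choices.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each of the 200 boosting rounds upweights the samples the previous
# stump misclassified, then the stumps vote with learned weights.
ada = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
test_acc = ada.score(X_te, y_te)
```

The held-out accuracy, rather than training accuracy, is what reflects the generalization behavior the excerpt attributes to boosting.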
“…In this way, we can provide a shared platform for the aforementioned schemes of dimensionality reduction so that consistent evaluation of the performance can be conducted. When RF is employed, the number of decision trees was empirically set to 100, and √F/2 variables are randomly drawn at each node of the trees, where F is the number of features associated with each pixel that has been identified by the aforementioned selection architectures [27]. The SVM setup is analogous to the one we used for our proposed approach.…”
Section: Ensemble Learning at Model Level
confidence: 99%
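The Random Forest configuration described in that excerpt can be sketched as follows; the feature count F and the synthetic data are assumptions for illustration, while the 100 trees and the √F/2 candidate features per split follow the quoted setup.

```python
# Sketch of the quoted ensemble setup: a 100-tree Random Forest
# drawing roughly sqrt(F)/2 candidate features at each node split.
# F stands for the per-pixel feature count; 64 is an assumed value.
import math
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

F = 64  # illustrative number of features per pixel
X, y = make_classification(n_samples=300, n_features=F, random_state=0)

max_feats = max(1, int(math.sqrt(F) / 2))  # sqrt(64)/2 = 4 candidates
rf = RandomForestClassifier(n_estimators=100, max_features=max_feats,
                            random_state=0).fit(X, y)
train_acc = rf.score(X, y)
```

Passing an integer to `max_features` fixes the per-split candidate count directly, which makes the √F/2 rule explicit rather than relying on the library's built-in `"sqrt"` shorthand.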
“…Some examples are non-parametric weighted feature extraction [15], dimension reduction techniques [16], attraction points feature extraction [17], band clustering [18], etc. In addition, other researchers focus on designing more robust classifiers, such as classification methods based on sparse representation [19] and ensemble learning [20][21][22][23]. In later research, HIC-SS methods based on deep models gradually became mainstream.…”
Section: Introduction
confidence: 99%