2019
DOI: 10.1080/01431161.2019.1577580
Dictionary-based classifiers for exploiting feature sequence information and their application to hyperspectral remotely sensed data

Cited by 18 publications (10 citation statements)
References 38 publications
“…Then, the performance of FWAdaBoost-SOFIES is compared with a number of classification approaches on the nine multi-class datasets for benchmark comparison. The following nine widely used classification approaches are used for comparison (the first three are zero-order FISs): 1) self-organizing fuzzy inference ensemble system (SOFEnsemble) [53]; 2) zero-order autonomous learning multiple-model classifier (ALMMo0) [54]; 3) eClass0 classifier [39]; 4) SVM with linear kernel; 5) KNN; 6) DT; 7) multilayer perceptron (MLP); 8) sequence classifier (SC) [58]; and 9) extreme learning machine (ELM) [59].…”
Section: Performance Demonstration
confidence: 99%
“…In this example, the SOFEnsemble, ALMMo0, eClass0 and SC classifiers follow the recommended parameter settings given in [39], [53], [54], [58]. For the SVM, KNN, MLP, DT and ELM classifiers, five different parameter settings are considered for each during the experiments and the best performances are reported.…”
Section: Performance Demonstration
confidence: 99%
“…Various modifications of such tree-based methods are presented in [35,36]. Dictionary-based classifiers [37] are also prevalent in remote sensing, but they depend largely on the rules used to build the dictionary and may not be applicable to all applications. Further, if multiple trees are incorporated, they can be treated as forest-based approaches.…”
Section: Classifier Selection
confidence: 99%
“…Instead of direct distances, a feature reconstruction error driven by regularization can also help to learn the boundaries nonlinearly. The nearest regularized subspace (NRS) classifier is such an approach [41] and has been investigated in the literature [37] for hyperspectral applications. For highly unbalanced data, the synthetic minority oversampling technique (SMOTE) [42,43] has been verified for classification.…”
Section: Classifier Selection
confidence: 99%
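The SMOTE idea mentioned in the quote — synthesizing minority-class samples by interpolating between a minority point and one of its minority-class neighbours — can be illustrated with a minimal NumPy sketch. This is not the reference implementation; the function name `smote_like`, the neighbour count `k`, and the toy data are all assumptions.

```python
# Minimal SMOTE-style oversampling sketch (illustrative only): each new
# sample lies on the segment between a minority point and a neighbour.
import numpy as np

def smote_like(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic samples from minority-class rows X_min."""
    rng = np.random.default_rng(rng)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Euclidean distances from sample i to every minority sample.
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)

# Four minority samples at the corners of the unit square (toy data).
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote_like(minority, n_new=5, rng=0)
print(synthetic.shape)  # (5, 2)
```

Because each synthetic point is a convex combination of two minority samples, it stays inside the minority class's convex hull, which is what lets SMOTE rebalance a dataset without inventing points far from the observed distribution.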
“…Approaches to band selection (BS) can be categorized into supervised (Patro, Subudhi, et al., 2019a; Yang, Du, Su, & Sheng, 2010), semi-supervised, and unsupervised (Patra et al., 2015) techniques. Again, depending on the type of information used, the process can be spectral (Patro, Subudhi, et al., 2019b) or spatial. In the present literature, a clustering- and ranking-centric unsupervised BS is proposed, and similar articles are discussed further.…”
Section: Introduction
confidence: 99%