2015
DOI: 10.1016/j.neucom.2013.09.070
Ensemble of extreme learning machine for remote sensing image classification

Cited by 73 publications (28 citation statements) · References 18 publications
“…• Select disjoint train and test datasets. The V-ELM has already been tested in a number of applications, such as hyperspectral image classification (Ayerdi, Marqués, & Graña, 2015), remote sensing data classification (Han & Liu, 2015), natural gas reservoir characterization (Anifowose, Labadin, & Abdulraheem, in press), wastewater quality index modeling (Zhao, Yuan, Chai, & Tang, 2011), and intrusion detection with the enhancement of multikernel learning (Fossaceca, Mazzuchi, & Sarkani, 2015). This basic architecture has been modified in the literature: for instance, soft class-dependent voting schemes (Cao et al., 2015) improve the reliability and sparseness of the model, a distributed approach allows classification to be performed in P2P networks (Sun, Yuan, & Wang, 2011), and a delta-test strategy for hidden-unit selection enhances the construction of ensembles in Yu et al. (2014).…”
Section: Algorithm 1 Cross-validation Scheme for Training the ELM Ensemble
confidence: 99%
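The quoted scheme trains several independently initialised ELMs on a disjoint train/test split and combines their outputs by majority voting (V-ELM). The following is a minimal sketch of that idea, not the authors' Algorithm 1: the ELM class, the sigmoid activation, the hidden-layer size, and the synthetic data are all assumptions made for illustration.

```python
import numpy as np

class ELM:
    """Single-hidden-layer extreme learning machine with random input weights."""
    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng or np.random.default_rng()

    def _hidden(self, X):
        # Sigmoid activation of a fixed random projection.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        T = np.eye(int(y.max()) + 1)[y]                    # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        self.beta = np.linalg.pinv(self._hidden(X)) @ T    # least-squares output weights
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)

def vote(ensemble, X):
    """Majority vote over the predictions of independently trained ELMs."""
    votes = np.stack([m.predict(X) for m in ensemble])     # shape (K, n_samples)
    return np.array([np.bincount(col).argmax() for col in votes.T])

# Toy data: disjoint train/test split, K independently initialised ELMs.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
idx = rng.permutation(len(X))
train, test = idx[:200], idx[200:]
ensemble = [ELM(n_hidden=50, rng=rng).fit(X[train], y[train]) for _ in range(7)]
print("voting accuracy:", (vote(ensemble, X[test]) == y[test]).mean())
```

Because the output weights are obtained in closed form with a pseudo-inverse, each ensemble member is cheap to train, which is what makes voting over many randomly initialised ELMs practical.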
“…The algorithm takes as input a training set Γ, an ensemble of classifiers Ω, and a selection threshold σ. It begins by computing the ensemble members' predictions Preds on the training samples [lines 3-7] and uses them to build the adjacency matrix w [lines 8-13]. Then, it estimates the individual contribution of every classifier using the definition provided by Equation (8). For each hᵢ ∈ Ω…”
Section: ISGEP Algorithm
confidence: 99%
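The quoted passage describes a selection step: predict on the training set, build a pairwise-agreement (adjacency) matrix w, score each member's individual contribution, and keep those above the threshold σ. The sketch below follows that outline under clearly stated assumptions; in particular, the citing text does not reproduce Equation (8), so the contribution measure used here (accuracy minus mean agreement with the rest of the ensemble) is only a hypothetical placeholder.

```python
import numpy as np

def isgep_select(preds, y, sigma):
    """Sketch of the quoted selection step: keep ensemble members whose estimated
    individual contribution exceeds the threshold sigma.

    preds : (K, n) predictions of the K classifiers on the training set Gamma
    y     : (n,)  true labels
    NOTE: Equation (8) is not reproduced in the citing text, so the contribution
    measure below is a placeholder, not the paper's definition.
    """
    K = preds.shape[0]
    # Adjacency matrix w: pairwise agreement rate between ensemble members.
    w = np.array([[(preds[i] == preds[j]).mean() for j in range(K)] for i in range(K)])
    acc = (preds == y).mean(axis=1)                  # per-classifier accuracy
    redundancy = (w.sum(axis=1) - 1.0) / (K - 1)     # mean agreement with the others
    contribution = acc - redundancy                  # placeholder for Equation (8)
    return np.flatnonzero(contribution > sigma), contribution

# Toy example: 5 classifiers, 100 samples, 3 classes.
rng = np.random.default_rng(1)
y = rng.integers(0, 3, size=100)
preds = np.where(rng.random((5, 100)) < 0.7, y, rng.integers(0, 3, size=(5, 100)))
kept, scores = isgep_select(preds, y, sigma=0.0)
print("kept classifiers:", kept)
```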
“…Min Han et al. [26] proposed a new classification method based on the extreme learning machine (ELM), which achieves good classification accuracy for remote sensing image classification.…”
Section: Related Work
confidence: 99%
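As a rough illustration of how an ELM can be applied to remote sensing imagery, the sketch below trains a minimal single-hidden-layer ELM on a small labelled subset of pixels and classifies the whole image cube pixel by pixel. The band count, image size, labels, and hidden-layer size are fabricated for the example; this is not the pipeline proposed by Han and Liu.

```python
import numpy as np

def elm_fit_predict(X_train, y_train, X_all, n_hidden=80, rng=None):
    """Minimal single-hidden-layer ELM: random projection plus least-squares output layer."""
    rng = rng or np.random.default_rng()
    W = rng.normal(size=(X_train.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    hidden = lambda X: 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden layer
    T = np.eye(int(y_train.max()) + 1)[y_train]             # one-hot targets
    beta = np.linalg.pinv(hidden(X_train)) @ T              # least-squares output weights
    return np.argmax(hidden(X_all) @ beta, axis=1)

# Fabricated multispectral cube and ground truth, purely for illustration.
rng = np.random.default_rng(2)
bands, height, width = 6, 64, 64
cube = rng.random((height, width, bands))
labels = (cube[..., 0] > cube[..., 1]).astype(int)

X, y = cube.reshape(-1, bands), labels.ravel()              # one feature row per pixel
train = rng.choice(len(X), size=1000, replace=False)        # small labelled subset
label_map = elm_fit_predict(X[train], y[train], X, rng=rng).reshape(height, width)
print("pixel accuracy:", (label_map == labels).mean())
```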