2016
DOI: 10.1109/lgrs.2016.2517178
Classification of Hyperspectral Remote Sensing Image Using Hierarchical Local-Receptive-Field-Based Extreme Learning Machine

Cited by 26 publications (21 citation statements)
References 19 publications

“…Extensive experiments indicate the effectiveness of the proposed method, which achieves comparable or better accuracy performance than existing methods, such as deep neural networks [28], multiple kernel learning [29], probabilistic class structure regularized sparse representation graph [30,31] and low-rank Gabor filtering [12] (see the results in Table 10).…”
Section: Introduction
confidence: 90%
“…Moreover, some CNN-based methods use preprocessing, often PCA, to either build a low dimensional set of non-linear input features or to extract additional information (e.g., edge detection). These methods include [20], a deep CNN with 2D input patches and R-PCA [33], a deep stacked auto-encoder with 2D input patches and PCA [18], a contextual deep CNN [36], a multi-hypothesis prediction [12], a low-rank Gabor filtering method [19], a deep CNN with 1D pixel spectra [23], a deep CNN with 1D pixel spectra, 2D pixel patches or 3D pixel cubes [21], a deep CNN with 1D pixel spectra and [28] a deep CNN with uniform smoothing kernel and 1D pixel spectra. Fortunately, the authors of the latter method shared the source code with us, which we could then use in our comparative experimental analysis.…”
Section: Related Work
confidence: 99%
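As a rough illustration of the PCA preprocessing step mentioned in the statement above, the following sketch reduces the spectral dimension of a hyperspectral cube before 2-D patches would be cut out for a CNN. The cube size, component count, and use of scikit-learn are illustrative assumptions, not details taken from the cited papers.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical hyperspectral cube: height x width x spectral bands.
H, W, B = 145, 145, 200          # illustrative sizes, not from the cited papers
cube = np.random.rand(H, W, B)   # stand-in for real reflectance data

# Flatten the spatial dimensions so each pixel spectrum is one sample.
pixels = cube.reshape(-1, B)

# Keep a small number of principal components (an assumed choice).
n_components = 30
pca = PCA(n_components=n_components)
reduced = pca.fit_transform(pixels)

# Restore the spatial layout; 2-D patches for a CNN would be cut from this reduced cube.
reduced_cube = reduced.reshape(H, W, n_components)
print(reduced_cube.shape)        # (145, 145, 30)
```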
“…Extreme Learning Machine (ELM) was originally proposed for generalized single-hidden-layer feedforward neural networks [25][26][27]. Owing to its fast learning speed, good generalization ability, and ease of implementation, ELM has become a popular research topic and has been widely used for supervised learning on HSI [28][29][30][31][32]. Although ELM has been applied successfully to supervised hyperspectral image classification, it still cannot avoid the need for a large number of labeled samples.…”
Section: Introduction
confidence: 99%
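To make the ELM learning strategy mentioned above concrete, here is a minimal single-hidden-layer ELM sketch: input weights and biases are drawn at random, and only the output weights are solved in closed form by least squares, which is what gives ELM its fast training. The function names, sizes, and activation choice are illustrative assumptions, not code from the cited works.

```python
import numpy as np

def elm_train(X, T, n_hidden=100, seed=0):
    """Basic ELM: random hidden layer, output weights by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                  # random biases
    H = np.tanh(X @ W + b)                             # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                       # output weights via pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: 200 samples, 20 spectral features, 3 one-hot classes.
X = np.random.rand(200, 20)
T = np.eye(3)[np.random.randint(0, 3, 200)]
W, b, beta = elm_train(X, T)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

Because training reduces to a single pseudo-inverse, no iterative backpropagation is needed, which is the source of the speed advantage the statement refers to.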
“…To overcome this difficulty, ELM-LRF was recently proposed, combining the network structure of a CNN with the learning strategy of ELM. Its efficiency has been demonstrated empirically in several applications and generalizations [20], [21], [22], [23], [24].…”
Section: Introduction
confidence: 99%
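The sketch below illustrates the ELM-LRF idea described in this statement, under simplifying assumptions of our own: convolutional feature maps are produced by untrained random filters (the local receptive fields), crudely pooled, and only the output weights are fitted by least squares, mirroring the ELM strategy. Filter counts, pooling, and all names are illustrative and do not reproduce the architecture of the cited paper.

```python
import numpy as np

def random_conv_features(images, n_filters=8, k=3, seed=0):
    """Random local receptive fields: convolve with untrained filters, then pool."""
    rng = np.random.default_rng(seed)
    filters = rng.standard_normal((n_filters, k, k))
    n, h, w = images.shape
    out_h, out_w = h - k + 1, w - k + 1
    feats = np.empty((n, n_filters, out_h, out_w))
    for i in range(n):
        for f in range(n_filters):
            for y in range(out_h):
                for x in range(out_w):
                    feats[i, f, y, x] = np.sum(images[i, y:y+k, x:x+k] * filters[f])
    # Crude stride-2 subsampling stands in for the pooling stage; flatten per image.
    pooled = feats[:, :, ::2, ::2]
    return pooled.reshape(n, -1)

# Toy usage: only the output weights are solved, by least squares, as in ELM.
images = np.random.rand(50, 16, 16)          # stand-in image patches
T = np.eye(4)[np.random.randint(0, 4, 50)]   # one-hot labels for 4 classes
H = random_conv_features(images)
beta = np.linalg.pinv(H) @ T                 # the only trainable parameters
pred = (H @ beta).argmax(axis=1)
```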