2014
DOI: 10.1109/jstars.2013.2262926
A Kernel-Based Feature Selection Method for SVM With RBF Kernel for Hyperspectral Image Classification

Cited by 326 publications (79 citation statements)
References 53 publications
“…Another discriminant-based supervised hyperspectral feature extraction method, non-parametric weighted feature extraction (NWFE) (Kuo and Landgrebe 2004), uses an improved discrimination criterion that assigns higher weights to samples closer to the decision boundary. Kuo et al. (2014) proposed a kernel-based hyperspectral feature selection method that optimizes a linear combination of feature z-score values inside the radial basis function kernel. Yang et al. (2014) present an approach inspired by compressive sensing, based on a single-layer feed-forward neural network with sparsity constraints on the input and hidden layers.…”
Section: Supervised Methods
confidence: 99%
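The weighting idea described in the excerpt — down-weighting or zeroing individual spectral bands inside an RBF kernel — can be sketched as below. This is a minimal illustration under assumptions, not the authors' actual optimization: the function name `weighted_rbf_kernel`, the toy data, and the exact way weights enter the squared distance are all hypothetical.

```python
import numpy as np

def weighted_rbf_kernel(X, Y, weights, gamma=1.0):
    """RBF kernel with a per-feature (per-band) weight on each squared
    difference. A band whose weight shrinks to zero is effectively
    removed, which is the mechanism a kernel-based selection method
    can exploit to rank features."""
    # Scale each feature axis by sqrt(weight), then compute pairwise
    # squared Euclidean distances in the re-weighted space.
    w = np.sqrt(np.asarray(weights, dtype=float))
    Xw, Yw = X * w, Y * w
    d2 = (
        np.sum(Xw**2, axis=1)[:, None]
        + np.sum(Yw**2, axis=1)[None, :]
        - 2.0 * Xw @ Yw.T
    )
    return np.exp(-gamma * np.clip(d2, 0.0, None))

# Hypothetical toy data: 5 pixels, 4 spectral bands.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))

# Zeroing a band's weight makes the kernel ignore that band entirely,
# equivalent to computing the kernel on the remaining bands.
K_all  = weighted_rbf_kernel(X, X, weights=[1, 1, 1, 1])
K_drop = weighted_rbf_kernel(X, X, weights=[1, 1, 1, 0])
K_sub  = weighted_rbf_kernel(X[:, :3], X[:, :3], weights=[1, 1, 1])
assert np.allclose(K_drop, K_sub)
```

The equivalence checked by the final assertion is what makes such weights usable as a feature-relevance score: the kernel, and hence the SVM decision function, depends on a band only through its weight.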
“…Various feature extraction methods for the purpose of HSI classification have been proposed in the literature, including unsupervised methods (Hotelling 1933; Chang et al. 1999; Wang and Chang 2006; Kaewpijit, Le-Moigne, and El-Ghazawi 2003; Tenenbaum, de Silva, and Langford 2000; Roweis and Saul 2000; Belkin and Niyogi 2001; Zhang and Zha 2005; He et al. 2005; He and Niyogi 2004; Zhang et al. 2007; Qiao, Chen, and Tan 2010), semi-supervised methods (Cai, He, and Han 2007; Chen and Zhang 2011; Sugiyama et al. 2010; Liao et al. 2013; Shao and Zhang 2014), and supervised methods (Bandos, Bruzzone, and Camps-Valls 2009; Li and Qian 2011; Kuo and Landgrebe 2004; Kuo et al. 2014; Yang et al. 2014; Zhong, Lin, and Zhang 2014; Tuia et al. 2014; Tao et al. 2013; Sugiyama 2007; Chen et al. 2014; Chen, Zhao, and Jia 2015; Sun et al. 2014; Castrodad et al. 2011).…”
Section: Related Work
confidence: 99%
“…SVM is a machine learning algorithm grounded in statistical learning theory [21]. Built on the principle of structural risk minimization, it is well suited to nonlinear, small-sample, and high-dimensional pattern-recognition problems [22].…”
Section: Prediction Principle of AVC-SVM
confidence: 99%
“…Built on the principle of structural risk minimization, it is well suited to nonlinear, small-sample, and high-dimensional pattern-recognition problems [22]. In addition, the SVM algorithm has many applications in bioinformatics [4, 21, 22]. In this paper, the SVM algorithm was used to predict the ion channel types of the conotoxins.…”
Section: Prediction Principle of AVC-SVM
confidence: 99%
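As a rough illustration of the RBF-kernel SVM classification these excerpts discuss, the sketch below trains an SVM on synthetic clusters standing in for labeled pixels or sequences. It uses scikit-learn and toy data as assumptions; it is not the AVC-SVM method or any dataset from the cited papers.

```python
# Minimal RBF-kernel SVM sketch with scikit-learn on synthetic data.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical data: 200 samples, 10 features, 3 well-separated classes.
X, y = make_blobs(n_samples=200, centers=3, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# C controls the structural-risk trade-off (margin width vs. training
# error); gamma sets the RBF width, the same kernel parameter that a
# feature-weighting scheme would modulate per band.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

The two hyperparameters `C` and `gamma` are typically chosen by cross-validated grid search; with high-dimensional, small-sample data this tuning is where the method's reputed robustness is won or lost.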
“…Supervised BS methods require some a priori knowledge, such as training samples or target signatures [9-14]. However, such training samples are often unavailable in practice, since acquiring reliable samples is very expensive in terms of both time and money [15, 16].…”
confidence: 99%