2009
DOI: 10.1016/j.isprsjprs.2008.03.004
Feature reduction using a singular value decomposition for the iterative guided spectral class rejection hybrid classifier

Abstract: Feature reduction in a remote sensing dataset is often desirable to decrease the processing time required to perform a classification and to improve overall classification accuracy. This work introduces a feature reduction method based on the singular value decomposition (SVD). This feature reduction technique was applied to training data from two multitemporal datasets of Landsat TM/ETM+ imagery acquired over forested areas in Virginia, USA and Rondônia, Brazil. Subsequent parallel iterative guided spe…
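The abstract describes reducing the band dimensionality of training pixels via the SVD before classification. As an illustration only (a minimal sketch of the general technique, not the paper's exact implementation; the function name `svd_reduce` and the choice to center the data are assumptions), the idea can be sketched with NumPy:

```python
# Hypothetical sketch of SVD-based feature reduction on multiband training
# pixels. Not the paper's implementation; names and preprocessing are assumed.
import numpy as np

def svd_reduce(pixels: np.ndarray, k: int) -> np.ndarray:
    """Project band vectors onto the top-k right singular vectors.

    pixels: (n_samples, n_bands) matrix of training pixels.
    k: number of reduced features to keep.
    """
    X = pixels - pixels.mean(axis=0)              # center each band
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T                           # (n_samples, k) features

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 6))                  # e.g. 6 Landsat TM bands
reduced = svd_reduce(data, k=3)
print(reduced.shape)                              # (100, 3)
```

The reduced features would then feed the classifier in place of the original bands.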

Cited by 32 publications (19 citation statements)
References 25 publications
“…The main purpose of attribute reduction, based on neighborhood rough sets, is to eliminate redundant attributes from classified data and extract useful information. Common approaches include: (1) Singular value decomposition: this method works well when the data dimension is high and is often used as a preprocessing step to help fuzzy rough reduction converge on high-dimensional data sets, but its computational cost is high [13,14]; (2) Principal component analysis: the mutual influence between evaluation indexes is eliminated by replacing the original variables with a few principal components with the largest contributions. Deleting irrelevant or unimportant attributes is necessary to eliminate the interference of irrelevant features when working with high-dimensional data [15]; (3) Feature extraction in deep learning: data-driven deep learning analysis has been developed and applied in many fields.…”
Section: Introduction
confidence: 99%
“…Its applicability, though, in digital image processing relies on the fact that multi-spectral remotely sensed spatial data are naturally redundant [33]. Applications include signal estimation in spectral data [34]; classification; data compression and noise reduction [35,36]; identification of spectral signatures [9]; and feature reduction [33].…”
Section: Image Processing Via SVD
confidence: 99%
“…To exemplify, [33] compared the performance of SVD to PCA as a feature reduction step on images containing fewer than 100 bands prior to forest/non-forest classifications. In their study, SVD outperformed PCA in terms of classification accuracy and computational time saving.…”
Section: Image Processing Via SVD
confidence: 99%
“…SVD is of interest in this research due to its known performance in tolerating data noise (Simek, 2003; Simek et al, 2004; Phillips, Watson, Wynne, & Blinn, 2009; Chakroborty & Saha, 2010). It is a factorization of a real matrix A ∈ ℝ^(m×n), m ≥ n, …”
Section: Experiments III - Feature Reduction Techniques Comparison
confidence: 99%
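The factorization mentioned in the last statement, A = UΣVᵀ for a real m×n matrix with m ≥ n, can be checked numerically (a minimal illustration; the variable names follow NumPy's convention, which returns the singular values as a vector):

```python
# Illustrative check of the SVD factorization A = U @ diag(s) @ Vt
# for a real m x n matrix with m >= n.
import numpy as np

A = np.arange(12, dtype=float).reshape(4, 3)      # m = 4 >= n = 3
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # s holds singular values
A_rec = U @ np.diag(s) @ Vt                       # reconstruct A
print(np.allclose(A, A_rec))                      # True
```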