1998
DOI: 10.1080/10106049809354645

Expanding the Narendra‐Fukunaga algorithm for selecting feature subset in classification problem

Abstract: An increasingly large number of texture features have been developed and used in combination with spectral features for classification problems. The question is how to select the combination/subset of texture features with spectral features best suited to a particular classification problem. In this paper, a modification of the Narendra-Fukunaga algorithm (1977) is developed to select a globally best subset of features. The computational aspects of the algorithm are discussed. The modified algorithm is guaranteed to select a…

Cited by 2 publications (2 citation statements; published 2019) · References 8 publications
“…The extended BB FSS algorithm (Duc & Andrianasolo, 1998), with the exception of the functionality of determining variables to be disregarded in the search process, was implemented in the Python programming language using the Mahalanobis distance as the subset evaluation metric. This algorithm, despite performing a complete FSS without exhausting every feature combination, may imply a high computational cost since both the number of evaluated subsets and dimension of the covariance matrix to be inverted during the Mahalanobis distance calculation grow exponentially as a function of the number of features (Webb, 2002).…”
Section: Methods
confidence: 99%
“…Two FSS algorithms were tested: extended branch and bound algorithm (BB) and recursive feature elimination algorithm (RFE). The extended BB FSS algorithm [27], with the exception of the functionality of determining variables to be disregarded in the search process, was implemented in the Python programming language using the Mahalanobis distance as the subset evaluation metric. This algorithm, despite performing a complete FSS without exhausting every feature combination, may imply a high computational cost since both the number of evaluated subsets and dimension of the covariance matrix to be inverted during the Mahalanobis distance calculation grow exponentially as a function of the number of features [5].…”
Section: Feature Selection
confidence: 99%
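
The citation statements above capture the essence of the approach: a branch-and-bound search over feature subsets that finds the globally best subset under a monotone criterion such as the Mahalanobis distance, without enumerating every combination. Below is a minimal illustrative sketch of that classic Narendra-Fukunaga scheme in Python, the language the citing papers mention. It is not the extended algorithm of Duc & Andrianasolo (1998); the function names, two-class data layout, and pooled-covariance estimate are assumptions made for illustration only.

import numpy as np

def mahalanobis_criterion(X1, X2, features):
    # Squared Mahalanobis distance between the two class means on the
    # given feature subset, using a pooled covariance estimate.
    # Solving against the pooled covariance is the dominant cost, and
    # the matrix grows with the subset dimension.
    idx = list(features)
    a, b = X1[:, idx], X2[:, idx]
    diff = a.mean(axis=0) - b.mean(axis=0)
    pooled = np.atleast_2d((np.cov(a, rowvar=False) + np.cov(b, rowvar=False)) / 2.0)
    return float(diff @ np.linalg.solve(pooled, diff))

def branch_and_bound(X1, X2, d):
    # Globally optimal d-feature subset via branch and bound.
    # Relies on monotonicity: removing a feature never increases the
    # Mahalanobis criterion, so any node that already scores no better
    # than the best complete d-subset can be pruned with its whole subtree.
    n = X1.shape[1]
    best_score, best_subset = -np.inf, None

    def search(subset, start):
        nonlocal best_score, best_subset
        score = mahalanobis_criterion(X1, X2, subset)
        if score <= best_score:
            return  # prune: no descendant subset can beat the bound
        if len(subset) == d:
            best_score, best_subset = score, subset
            return
        # Branch: delete one more feature. Deletions are enumerated in
        # non-decreasing position order so each subset is visited once.
        for i in range(start, len(subset)):
            search(subset[:i] + subset[i + 1:], i)

    search(tuple(range(n)), 0)
    return best_subset, best_score

# Illustrative usage on synthetic two-class data:
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(60, 6))   # class 1: 60 samples, 6 features
X2 = rng.normal(0.7, 1.0, size=(60, 6))   # class 2: shifted mean
subset, score = branch_and_bound(X1, X2, d=3)

The pruning test is what makes the search complete without exhaustive enumeration, while the per-node covariance solve illustrates the computational-cost caveat quoted in the citation statements above.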