2014
DOI: 10.1007/978-3-642-55192-5_5
Redundant Feature Selection for Telemetry Data

Abstract: Feature sets in many domains often contain many irrelevant and redundant features, both of which have a negative effect on the performance and complexity of agents that use the data [8]. Supervised feature selection aims to overcome this problem by selecting features that are highly related to the class labels, yet unrelated to each other. One proposed technique to select good features with few inter-dependencies is minimal Redundancy Maximal Relevance (mRMR) [11], but this can be impractical with la…
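The greedy, pairwise form of the mRMR criterion referenced in the abstract can be sketched as follows. This is an illustrative Python sketch, not the paper's own code: the function name mrmr_select is hypothetical, and scikit-learn's mutual information estimators stand in for whatever relevance measure the authors used.

```python
# Greedy mRMR sketch (difference form: relevance minus mean redundancy).
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k):
    """Greedily pick k feature indices maximizing (relevance - redundancy)."""
    relevance = mutual_info_classif(X, y)           # MI(feature; class labels)
    selected = [int(np.argmax(relevance))]          # seed with most relevant
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # Redundancy: mean MI between candidate j and already-selected features.
            redundancy = np.mean([
                mutual_info_regression(X[:, [s]], X[:, j])[0] for s in selected
            ])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

The inner loop estimates one feature-feature MI per candidate/selected pair, so the number of estimates grows quadratically with the number of features kept; this cost is what makes plain mRMR expensive on large feature sets.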

Cited by 4 publications (3 citation statements); References 17 publications.
“…We apply supervised feature selection to choose the features from this full set, and in particular we use Symmetric Uncertainty (SU) (Witten et al., 2011) and minimal Redundancy Maximal Relevancy (mRMR) selection (Peng, Long, & Ding, 2005; Hermana et al., 2013; Taylor et al., 2014). SU is a variant of Mutual Information (MI) that is normalized by the mean entropy of the two variables to mitigate the bias MI has towards features of high dimensionality.…”
Section: Feature Selection
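A minimal sketch of Symmetric Uncertainty as described in the statement above, for discrete variables, with MI normalized by the mean of the two marginal entropies; the helper names are assumptions, not the cited papers' API:

```python
# Symmetric Uncertainty: SU(x, y) = I(x; y) / ((H(x) + H(y)) / 2), in [0, 1].
import numpy as np
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a discrete sequence."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def symmetric_uncertainty(x, y):
    """MI normalized by the mean of the two marginal entropies."""
    h_x, h_y = entropy(x), entropy(y)
    h_xy = entropy(list(zip(x, y)))    # joint entropy via paired symbols
    mi = h_x + h_y - h_xy              # mutual information I(x; y)
    return 2.0 * mi / (h_x + h_y) if (h_x + h_y) > 0 else 0.0
```

Because SU is bounded in [0, 1], it dampens MI's tendency to grow with the number of distinct values a feature can take, which is the bias the citing authors mention.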
“…This is repeated until a given number of features is chosen from the set. In this paper, as in Taylor et al. (2014), we first select one extracted feature from each signal, before combining them in a second stage of selection. In the classification evaluations, the first fifteen selected features are used.…”
Section: Feature Selection
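The two-stage procedure this statement describes could look roughly like the following sketch; the dict-of-arrays data layout and the reuse of the hypothetical mrmr_select from the earlier sketch are assumptions rather than the paper's interface:

```python
# Two-stage selection sketch: one extracted feature per signal first,
# then a second selection pass over the combined per-signal picks.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def two_stage_select(signal_features, y, k=15):
    """signal_features: dict of signal name -> (n_samples, n_extracted) array."""
    stage_one = []
    for name, feats in signal_features.items():
        # Stage 1: keep the single most label-relevant extracted feature
        # of each signal (scored here with mutual information).
        relevance = mutual_info_classif(feats, y)
        stage_one.append(feats[:, int(np.argmax(relevance))])
    combined = np.column_stack(stage_one)
    # Stage 2: select the final k features from the combined candidates;
    # mrmr_select is the hypothetical helper sketched after the abstract.
    return mrmr_select(combined, y, k)
```

The default k=15 mirrors the fifteen selected features used in the citing paper's classification evaluations.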