2020
DOI: 10.1145/3409382

Causality-based Feature Selection

Abstract: Feature selection is a crucial preprocessing step in data analytics and machine learning. Classical feature selection algorithms select features based on the correlations between predictive features and the class variable and do not attempt to capture causal relationships between them. It has been shown that the knowledge about the causal relationships between features and the class variable has potential benefits for building interpretable and robust prediction models, since causal relationships imply the und…
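The abstract contrasts classical correlation-based selection with causal approaches. As an illustration only (not from the paper), a classical filter method might rank features by the absolute Pearson correlation with the class variable; the function name and threshold-free top-k design below are assumptions for the sketch:

```python
# Illustrative sketch of classical correlation-based feature selection,
# as described in the abstract: rank features by |corr(feature, class)|.
# This is not the paper's method; names are hypothetical.
import numpy as np

def select_by_correlation(X, y, k):
    """Return indices of the k features most correlated with y."""
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    # Highest absolute correlation first.
    return list(np.argsort(scores)[::-1][:k])
```

Such a ranking captures statistical association only; a feature can score highly through a confounder without any causal link to the class, which is the gap causality-based feature selection aims to close.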


Cited by 125 publications (72 citation statements)
References 52 publications
“…1 displays the results for the three test cases (TC) used in Yu et al (2018). The standard metrics (Aliferis et al (2010b); Yu et al (2018); Yu et al (2019)) were applied:…”
Section: Methods (mentioning)
confidence: 99%
“…It combines the strategies of simultaneous and divide-and-conquer algorithms to derive a method capable of simultaneously learning the MB while distinguishing PC from spouses. Yu et al (2019) is an extensive review on DAG MB learning algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
“…The methods by selecting and transforming multi-dimensional features into low-dimensional features have been proposed to improve the quality of features, reduce computational complexity, and improve recognition accuracy. [92][93][94][95][96][97][98] In present machine learning feature engineering, these two powerful technologies have shown great potential in various applications, such as the text data, [99] planetary system metrics, [100] clinical medicine, [101] human hand motion classification, [102] human activity recognition, [103] etc. In materials science, the transformation in various features is also important, which might directly affect the analytical accuracy in structure and properties.…”
Section: Strengthening the Analysis and Optimization on Features (mentioning)
confidence: 99%
“…In traditional MB discovery techniques, features must be present before learning begins. Different algorithms developed for MB learning, which is based on traditional concepts such as Incremental Association-Based Markov Blanket (IAMB) [16], Max-Min Markov Blanket (MMMB) [17], HITON-MB (HITON-MB) [18], Simultaneous Markov Blanket (STMB) [19], Iterative Parent-Child-based MB (IPCMB) [19], Balanced Markov Blanket (BAMB) [6], and Efficient and Effective Markov Blanket (EEMB) [7]. While MB is learning the Parents-Child and Spouses of the target feature, T cannot differentiate by the IAMB [16] algorithm.…”
Section: Related Work (mentioning)
confidence: 99%
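The citation statement above lists Markov blanket (MB) discovery algorithms such as IAMB [16]. As a rough illustration of the IAMB-style grow-shrink idea only (not any cited author's implementation), the sketch below uses a plug-in conditional mutual information estimate with an assumed fixed threshold in place of a proper statistical independence test:

```python
# Minimal IAMB-style Markov blanket search on discrete data.
# The CMI estimator and the fixed `threshold` independence cut-off are
# simplifying assumptions for illustration, not the published algorithm.
import numpy as np

def cond_mutual_info(x, y, z_cols):
    """Plug-in estimate of I(X; Y | Z) from discrete samples."""
    n = len(x)
    # Encode the conditioning set Z as one discrete key per sample.
    z = np.zeros(n, dtype=int)
    for col in z_cols:
        z = z * (int(col.max()) + 1) + col
    cmi = 0.0
    for zv in np.unique(z):
        mask = z == zv
        pz = mask.mean()
        xs, ys = x[mask], y[mask]
        for xv in np.unique(xs):
            for yv in np.unique(ys):
                pxy = np.mean((xs == xv) & (ys == yv))
                if pxy > 0:
                    px = np.mean(xs == xv)
                    py = np.mean(ys == yv)
                    cmi += pz * pxy * np.log(pxy / (px * py))
    return cmi

def iamb(data, target, threshold=0.02):
    """Growing phase then shrinking phase over columns of `data`."""
    features = [c for c in range(data.shape[1]) if c != target]
    mb = []
    # Growing: repeatedly add the feature with maximal CMI with the
    # target conditioned on the current blanket estimate.
    changed = True
    while changed:
        changed = False
        best, best_cmi = None, threshold
        for f in features:
            if f in mb:
                continue
            cmi = cond_mutual_info(data[:, f], data[:, target],
                                   [data[:, m] for m in mb])
            if cmi > best_cmi:
                best, best_cmi = f, cmi
        if best is not None:
            mb.append(best)
            changed = True
    # Shrinking: drop features independent of the target given the rest.
    for f in list(mb):
        rest = [m for m in mb if m != f]
        if cond_mutual_info(data[:, f], data[:, target],
                            [data[:, m] for m in rest]) < threshold:
            mb.remove(f)
    return sorted(mb)
```

The shrinking phase is what removes false positives admitted greedily during growth; variants like MMMB and STMB cited above differ mainly in how they restrict the conditioning sets and separate parents/children from spouses.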