2019
DOI: 10.1109/tits.2018.2881284
Maximal Information Coefficient-Based Two-Stage Feature Selection Method for Railway Condition Monitoring

Cited by 43 publications (15 citation statements) | References 21 publications
“…The MIC is an effective metric for evaluating the nonlinear dependence between two variables and was proposed by Reshef et al. [33] The MIC belongs to the class of maximal information-based nonparametric exploration statistics and has been widely applied in variable relationship analysis. [34,35] For two variables x and y, the MIC is computed as follows [36,37]:

$$\mathrm{MIC}(x,y)=\max\frac{I(x,y)}{\log_{2}\min(n_x,n_y)}$$

where $I(x,y)$ is the mutual information between variables x and y, and $n_x$ and $n_y$ are the numbers of bins used to partition x and y, respectively. According to Reshef et al., [33] the bin numbers $n_x$ and $n_y$ should satisfy the condition $n_x \times n_y < n^{0.6}$, where n is the number of samples in the tested variable pair.…”
Section: Improved RFFA Methods (mentioning)
confidence: 99%
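Since the formula above is a maximization over grid partitions, it can be approximated directly from the definition. The sketch below is only an illustration, not the cited authors' implementation: `approx_mic` is a made-up name, it scans equispaced bin counts satisfying $n_x \times n_y < n^{0.6}$, estimates the mutual information from a joint histogram, and normalizes by $\log_2\min(n_x,n_y)$. The published MIC statistic optimizes over adaptive grids (e.g., the MINE algorithm), so this equispaced version will generally underestimate it.

```python
import numpy as np

def approx_mic(x, y, alpha=0.6):
    """Approximate MIC(x, y): maximise normalised mutual information over
    equispaced grids whose bin counts satisfy n_x * n_y < n**alpha.
    Illustrative only; the published MIC optimises over adaptive grids."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    limit = n ** alpha                    # grid-size bound n**0.6 from Reshef et al.
    best = 0.0
    for nx in range(2, int(limit) + 1):
        for ny in range(2, int(limit) + 1):
            if nx * ny >= limit:          # enforce n_x * n_y < n**0.6
                continue
            # joint histogram -> empirical joint distribution
            pxy, _, _ = np.histogram2d(x, y, bins=(nx, ny))
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            # mutual information I(x, y) in bits
            mi = np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))
            # normalise by log2(min(n_x, n_y)) as in the MIC definition
            best = max(best, mi / np.log2(min(nx, ny)))
    return best
```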
“…The MIC is an effective metric for evaluating the nonlinear dependence between two variables and was proposed by Reshef et al. [33] The MIC belongs to the class of maximal information-based nonparametric exploration statistics and has been widely applied in variable relationship analysis. [34,35] For two variables x and y, the MIC is computed as follows [36,37]:…”
Section: Two-level Localization Strategy (mentioning)
confidence: 99%
“…The change rates of the measures that changed significantly after ctDCS for the active group with a response to ctDCS, but did not change significantly for the active group without a response to ctDCS, were used as features to predict the treatment outcome of ctDCS. A two-step feature selection procedure was used to select the most informative features for the best prediction performance (Wei et al., 2019; Wen et al., 2019). In the first step, we calculated the correlation between each feature and the class labels using the maximal information coefficient (MIC) and ranked the features according to their MIC values (Reshef et al., 2011).…”
Section: Methods (mentioning)
confidence: 99%
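The first step described above (scoring each feature against the class labels with the MIC, then ranking) can be sketched as follows. This is a hypothetical illustration rather than the cited authors' code: `rank_features_by_mic`, `X`, and `y` are made-up names, and `mic_fn` can be any MIC estimator, for example the `approx_mic` sketch shown earlier.

```python
import numpy as np

def rank_features_by_mic(X, labels, mic_fn):
    """Step 1 of the two-step procedure: score every feature column of X
    against the class labels with the MIC and sort features by that score.
    `mic_fn` is any MIC estimator, e.g. the approx_mic sketch above."""
    scores = np.array([mic_fn(X[:, j], labels) for j in range(X.shape[1])])
    order = np.argsort(scores)[::-1]      # highest-MIC features first
    return order, scores[order]

# Hypothetical usage: X is an (n_samples, n_features) array, y holds class labels.
# order, ranked_scores = rank_features_by_mic(X, y, approx_mic)
```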
“…The maximum information coefficient (MIC) 29 was proposed by Reshef in 2011. The MIC can capture the correlation between variables of various types by taking the maximum normalized mutual information over different grid divisions, which gives it higher universality and fairness. 30,31 For short-term load forecasting (STLF) with multiple factors and variables, the MIC can accurately measure the correlation between variables.…”
Section: Methodologies (mentioning)
confidence: 99%
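A quick way to see the universality claimed above is to compare the MIC with a linear correlation measure on a nonlinear relationship. The toy check below reuses the `approx_mic` sketch from earlier and uses synthetic data, so the exact numbers are illustrative only: a noiseless quadratic relation has a near-zero Pearson correlation but a clearly non-trivial MIC score.

```python
import numpy as np

# Toy check of MIC vs. a linear correlation measure (illustrative only;
# relies on the approx_mic sketch defined earlier on this page).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 500)
y = x ** 2                               # purely nonlinear dependence

print(np.corrcoef(x, y)[0, 1])           # ~0: Pearson misses the relation
print(approx_mic(x, y))                  # clearly > 0: MIC detects it
```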