2014 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2014.6889561

Variable selection for regression problems using Gaussian mixture models to estimate mutual information

Abstract: Variable selection is a crucial part of building regression models, and is preferably done as a filtering method independently of the model training. Mutual information is a popular relevance criterion for this, but it is not trivial to estimate accurately from a limited amount of data. In this paper, a method is presented where a Gaussian mixture model is used to estimate the joint density of the input and output variables, and subsequently used to select the most relevant variables by maximising t…
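Since the abstract only sketches the method, here is a minimal illustration of the underlying idea. It is a sketch under assumptions, not the authors' code: scikit-learn's GaussianMixture stands in for whatever fitting procedure the paper uses, the component count is fixed by hand, and the expectation defining the mutual information is approximated by Monte Carlo.

```python
# Minimal sketch of GMM-based MI estimation (not the paper's implementation).
# Key fact used: marginalising a GMM over a coordinate subset yields another
# GMM whose means/covariances are the corresponding rows/columns.
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def gmm_logpdf(z, weights, means, covs):
    """Log density of a Gaussian mixture evaluated at the rows of z."""
    comp = np.array([multivariate_normal.logpdf(z, m, c)
                     for m, c in zip(means, covs)])      # shape (K, n)
    return logsumexp(comp.T + np.log(weights), axis=1)   # shape (n,)

def marginal_params(gmm, idx):
    """Parameters of the GMM marginal over the coordinates in idx."""
    means = gmm.means_[:, idx]
    covs = gmm.covariances_[:, idx][:, :, idx]
    return gmm.weights_, means, covs

def mi_gmm(xy, n_components=5, n_mc=20_000, seed=0):
    """Estimate I(X; Y) from a GMM fitted to the joint samples.

    xy stacks the candidate input variables with the output in the last
    column; E[log p(x,y) - log p(x) - log p(y)] is approximated with
    Monte Carlo samples drawn from the fitted joint GMM.
    """
    d = xy.shape[1] - 1
    gmm = GaussianMixture(n_components, covariance_type="full",
                          random_state=seed).fit(xy)
    z, _ = gmm.sample(n_mc)
    log_joint = gmm.score_samples(z)
    log_px = gmm_logpdf(z[:, :d], *marginal_params(gmm, list(range(d))))
    log_py = gmm_logpdf(z[:, d:], *marginal_params(gmm, [d]))
    return float(np.mean(log_joint - log_px - log_py))
```

In a forward-selection loop, one would call mi_gmm once per candidate set of inputs and keep the variable whose inclusion maximises the estimate, mirroring the maximisation step the abstract truncates.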

Cited by 5 publications (4 citation statements)
References: 31 publications
“…The idea of using Mutual Information to carry out feature selection has been proposed in the literature in several ways [28, 29, 30, 31]. Let x_i and y_i, for i = 1, …, N, be the input sample pairs; then the Mutual Information between these two variables can be defined as the amount of information that can be extracted from Y given X.…”
Section: Experiments and Discussion (citation type: mentioning)
confidence: 99%
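For reference, the quantity the quoted passage appeals to is the standard definition of mutual information (a textbook identity, not notation taken from the citing paper):

```latex
I(X;Y) = \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy = H(Y) - H(Y \mid X)
```

That is, I(X;Y) is the reduction in uncertainty about Y from observing X (and, by symmetry, about X from observing Y), which is what makes it a natural relevance score for candidate input variables.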
“…To address these requirements, we present Gaussian mixture model (GMM)-MI (pronounced 'Jimmie'), an algorithm to estimate the full distribution of I(X, Y) based on fitting samples drawn from the distribution with GMMs. While the use of GMMs to estimate MI is not new [81-86], these previous works only considered MI in the context of feature selection, and did not carry out uncertainty quantification on the relevant MI estimates, which is critical when using MI to interpret DL models. GMM-MI has been designed to be a robust and flexible tool that can be applied to multiple settings where MI estimation is required.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
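One standard way to turn a point estimate like the one sketched above into the "full distribution" the quote mentions is bootstrap resampling. The sketch below assumes that approach rather than reproducing GMM-MI's actual procedure, and it reuses the hypothetical mi_gmm() helper from the earlier sketch:

```python
# Hedged sketch: nonparametric bootstrap over GMM-based MI estimates.
# Assumes mi_gmm() from the earlier sketch is in scope; GMM-MI's real
# procedure (see the cited paper) may differ in detail.
import numpy as np

def mi_distribution(xy, n_boot=100, seed=0, **mi_kwargs):
    """Bootstrap samples of the MI estimate for the joint data in xy."""
    rng = np.random.default_rng(seed)
    n = xy.shape[0]
    out = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample rows with replacement
        out[b] = mi_gmm(xy[idx], seed=b, **mi_kwargs)
    return out

# Example: median estimate and a 68% interval for I(X; Y)
# mis = mi_distribution(np.column_stack([X, y]))
# print(np.median(mis), np.percentile(mis, [16, 84]))
```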