2007
DOI: 10.1109/tnn.2006.882813

The AIC Criterion and Symmetrizing the Kullback–Leibler Divergence

Abstract: The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a function used for ranking candidate models, which is a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, what can become inconvenient in model selection applications is its lack of symmetry. Simple examples can show that reversing the role of t…
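As an illustration of the asymmetry the abstract refers to, the following minimal sketch compares KL(p||q) with KL(q||p) for two univariate Gaussians using the closed-form Gaussian KL expression, and prints one simple symmetrization (the averaged, Jeffreys-type divergence). This is an assumption-laden illustration, not the symmetrization proposed in the paper.

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    # KL( N(mu1, s1^2) || N(mu2, s2^2) ), closed form for univariate Gaussians
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

p = (0.0, 1.0)   # reference ("true") density
q = (1.0, 2.0)   # candidate density

kl_pq = kl_gauss(*p, *q)
kl_qp = kl_gauss(*q, *p)
print(f"KL(p||q) = {kl_pq:.4f}, KL(q||p) = {kl_qp:.4f}")   # the two values differ
print(f"Averaged (Jeffreys-type) divergence = {0.5 * (kl_pq + kl_qp):.4f}")
```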

Cited by 134 publications (56 citation statements)
References 25 publications
“…The pertinent variables of the regression model are determined through a stepwise procedure (Bendel and Afifi 1977), which eliminates variables that are not statistically significant in the model according to the Akaike Information Criterion (this criterion measures the information lost by each candidate model, and the pertinent model is the one with the lowest AIC) (Seghouane and Amari 2007). In addition, the significance of the correlation between the selected pertinent variables is assessed with the Pearson test of correlation (Millot 2009).…”
Section: Elaboration Of the Multiple Linear Regression Of The Annual
confidence: 99%
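The stepwise elimination described in this citation statement can be sketched as a backward-elimination loop driven by AIC. The sketch below uses statsmodels OLS; the data, column names, and stopping rule are hypothetical illustrations, not the cited authors' actual procedure.

```python
# A minimal backward-elimination sketch driven by AIC (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(100, 4)), columns=["x1", "x2", "x3", "x4"])
data["y"] = 2.0 * data["x1"] - 1.5 * data["x3"] + rng.normal(size=100)

def fit_aic(cols):
    # AIC of an OLS fit of y on the given predictor columns (plus an intercept).
    X = sm.add_constant(data[cols])
    return sm.OLS(data["y"], X).fit().aic

selected = ["x1", "x2", "x3", "x4"]
improved = True
while improved and len(selected) > 1:
    improved = False
    current_aic = fit_aic(selected)
    # Try dropping each variable in turn; keep the drop that lowers AIC the most.
    trials = {c: fit_aic([v for v in selected if v != c]) for c in selected}
    best = min(trials, key=trials.get)
    if trials[best] < current_aic:
        selected.remove(best)
        improved = True

print("Retained predictors:", selected)
```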
“…In the experiments we assumed a fixed number of hidden units. However, there is a large body of work on how to choose the proper structure of a model using model selection with different complexity penalty terms [2], to name only a few, Akaike's information theoretic criterion [1,32] or Rissanen's minimum description length [29]. Last but not least, in this work we considered a probabilistic deep model. However, there is a vast range of deterministic deep models based on auto-encoders.…”
Section: Results
confidence: 99%
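To make the complexity-penalty idea in the statement above concrete, here is a hedged sketch that scores candidate network sizes with an AIC penalty and a two-part MDL (BIC-type) penalty. The log-likelihoods, parameter counts, and sample size are placeholders, not results from the cited work; in practice they would come from fitting each candidate model.

```python
import numpy as np

n = 500  # number of training samples (assumed)
candidates = {
    # hidden units: (maximized log-likelihood, number of free parameters) -- placeholders
    5:  (-1310.0, 41),
    10: (-1260.0, 81),
    20: (-1250.0, 161),
}

for h, (loglik, k) in candidates.items():
    aic = -2.0 * loglik + 2.0 * k            # Akaike's information criterion
    mdl = -loglik + 0.5 * k * np.log(n)      # two-part MDL / BIC-type code length
    print(f"hidden={h:2d}  AIC={aic:8.1f}  MDL={mdl:8.1f}")
```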
“…The Kullback-Leibler measure can be understood as a comparison criterion between two distributions. In this section, we derive two classes of entropy measures and one class of divergence measures, which can be understood as new goodness-of-fit quantities such as those discussed by Seghouane and Amari (2007). All these measures are defined for one element or between two elements in the MOEW family.…”
Section: Information Theory Measures
confidence: 99%
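As a rough illustration of using a Kullback-Leibler-type quantity as a goodness-of-fit measure, the following Monte Carlo sketch scores a moment-fitted Gaussian against samples from an assumed "true" Student-t density. The distributions and sample size are illustrative assumptions; this is not one of the measures derived in the cited paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.standard_t(df=5, size=5000)          # samples from an assumed "true" density

mu, sigma = x.mean(), x.std(ddof=1)          # fit a Gaussian candidate by moments
true_logpdf = stats.t.logpdf(x, df=5)
cand_logpdf = stats.norm.logpdf(x, loc=mu, scale=sigma)

# Sample-mean estimate of E_p[log p(x) - log q(x)]; smaller means a better fit.
kl_estimate = np.mean(true_logpdf - cand_logpdf)
print(f"Estimated KL(true || fitted Gaussian): {kl_estimate:.4f}")
```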