2013
DOI: 10.48550/arxiv.1305.5493
Preprint
Information Criteria for Deciding between Normal Regression Models

Abstract: Regression models fitted to data can be assessed on their goodness of fit, though models with many parameters should be disfavored to prevent over-fitting. Statisticians' tools for this are little known to physical scientists. These include the Akaike Information Criterion (AIC), a penalized goodness-of-fit statistic, and the AICc, a variant including a small-sample correction. They entered the physical sciences through being used by astrophysicists to compare cosmological models; e.g., predictions of the dist…
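For normal (least-squares) regression models, the AIC and its small-sample variant AICc reduce to simple expressions in the residual sum of squares. A minimal sketch, dropping additive constants shared by all candidate models; the symbols n (data points), k (fitted parameters), and rss are standard notation, not taken from the paper:

```python
import math

def aic_normal(rss, n, k):
    """AIC for a normal regression model: n*ln(RSS/n) + 2k,
    with k the number of fitted parameters (including the noise
    variance), n data points, and residual sum of squares rss.
    Additive constants common to all models are dropped."""
    return n * math.log(rss / n) + 2 * k

def aicc_normal(rss, n, k):
    """AICc: the AIC plus the small-sample correction term
    2k(k+1)/(n-k-1), which grows as k approaches n."""
    return aic_normal(rss, n, k) + 2 * k * (k + 1) / (n - k - 1)
```

Lower values indicate a better trade-off between fit quality and parameter count; the correction term makes AICc penalize extra parameters more strongly when n is small relative to k.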

Cited by 1 publication (1 citation statement)
References 45 publications (50 reference statements)
“…We experimented with the number of terms in (4) by running the optimisation code NLopt (Johnson 2010 1 , Powell 2009 2 ) for a fixed number of objective function evaluations and various N and compared them using the adjusted Akaike information criterion (e.g. Maier 2013), which produced more consistent results compared to similarly used Bayesian information criteria. In most cases the optimum value turned out to be N = 1 and it was never above four; moreover, the results for r i did not seem to be much affected if just a single term was used.…”
Section: Variability Rate Inference
confidence: 99%
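The procedure described in the citation statement — fit models with an increasing number of terms and keep the one with the lowest adjusted (corrected) AIC — can be sketched as follows. This is an illustrative reconstruction, not the citing authors' code: `fit_model` is a hypothetical stand-in for their NLopt-based optimisation, and the parameter count per term is an assumed example.

```python
import math

def aicc(rss, n, k):
    # AICc for a least-squares fit: n*ln(RSS/n) + 2k plus the
    # small-sample correction 2k(k+1)/(n-k-1).
    return n * math.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

def select_n_terms(fit_model, n_points, max_terms=4):
    """Fit models with N = 1..max_terms terms and return the N that
    minimises AICc. fit_model(N) must return the residual sum of
    squares of the best fit with N terms (hypothetical interface)."""
    best_n, best_score = None, math.inf
    for n_terms in range(1, max_terms + 1):
        # Assumed parameter count: one amplitude per term plus the
        # noise variance (purely illustrative).
        k = n_terms + 1
        score = aicc(fit_model(n_terms), n_points, k)
        if score < best_score:
            best_n, best_score = n_terms, score
    return best_n
```

With this scoring, adding a term is only accepted when it reduces the residual sum of squares by enough to outweigh the penalty — consistent with the statement's finding that the optimum was usually N = 1 and never above four.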