2022
DOI: 10.1088/1361-6633/aca6f8
Information geometry for multiparameter models: new perspectives on the origin of simplicity

Abstract: Complex models in physics, biology, economics, and engineering are often ill-determined or sloppy: their multiple parameters can vary over wide ranges without significant changes in their predictions. This review uses the tools of information geometry to explore this phenomenon, and the deep relations between sloppiness and emergent theories. We introduce information geometry, the model manifold of predictions whose coordinates are the model parameters, and its hyperribbon structure. These hyperribbons explain…
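The sloppiness described in this abstract is conventionally diagnosed through the eigenvalue spectrum of the Fisher information matrix (FIM): in sloppy models the eigenvalues spread roughly uniformly over many decades. As a minimal sketch, using a sum-of-exponentials model (a standard illustrative example in the sloppiness literature; the parameter values and sample times below are arbitrary choices, not taken from the review):

```python
import numpy as np

# Classic sloppy example: y(t, theta) = sum_i exp(-theta_i * t), observed at a
# handful of times. Parameter values and times here are hypothetical.
times = np.linspace(0.5, 5.0, 10)
theta0 = np.array([0.5, 1.0, 2.0])

def predictions(theta):
    return np.exp(-np.outer(times, theta)).sum(axis=1)

def jacobian(theta, h=1e-6):
    # Forward-difference derivatives of the prediction vector w.r.t. each parameter.
    J = np.empty((len(times), len(theta)))
    for i in range(len(theta)):
        step = np.zeros_like(theta)
        step[i] = h
        J[:, i] = (predictions(theta + step) - predictions(theta)) / h
    return J

J = jacobian(theta0)
fim = J.T @ J  # Fisher information metric for unit-variance Gaussian noise
eigvals = np.linalg.eigvalsh(fim)[::-1]
print(eigvals / eigvals[0])  # ratios typically span several decades: sloppiness
```

The widely spaced eigenvalues correspond to "stiff" and "sloppy" parameter directions; the hyperribbon structure of the model manifold is the geometric counterpart of this spectrum.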

Cited by 13 publications (6 citation statements)
References: 119 publications
“…Another potentially intriguing future direction is a comparison with other formal approaches to the emergence of simplicity that can lead to different predictions. Recent studies have argued that the Jeffreys prior (upon which our geometric approach is based) could give an incomplete picture of the complexity of a class of models that occur commonly in the natural sciences, which contain many combinations of parameters that do not affect model behavior, and proposed instead the use of data-dependent priors (43, 44). The two methods lead to different results, especially in the data-limited regime (45).…”
Section: Discussion
confidence: 99%
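For context, the Jeffreys prior discussed in this statement is the density proportional to the Riemannian volume of the Fisher information metric, p(θ) ∝ √det g(θ). A quick numerical sanity check for the textbook Bernoulli case, where the Jeffreys prior is known to be Beta(1/2, 1/2) (an illustration only, not code from the cited works):

```python
import numpy as np
from scipy.stats import beta

# Jeffreys prior: p(theta) ∝ sqrt(det g(theta)), with g the Fisher information.
# For a single Bernoulli(theta) observation, g(theta) = 1 / (theta * (1 - theta)),
# so sqrt(g) ∝ theta^(-1/2) * (1 - theta)^(-1/2), i.e. the Beta(1/2, 1/2) density.
theta = np.linspace(0.01, 0.99, 99)
jeffreys_unnorm = np.sqrt(1.0 / (theta * (1.0 - theta)))

# The ratio to the Beta(1/2, 1/2) pdf should be a constant (the normalizer, pi).
ratio = jeffreys_unnorm / beta(0.5, 0.5).pdf(theta)
print(np.allclose(ratio, ratio[0]))  # True
```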
“…While our concern here is with the ideal properties, for practical use nearly optimal approximations may be required. One possibility is the adaptive slab-and-spike prior introduced in [6]. Another would be to use some variational family with adjustable meta-parameters [41].…”
Section: Discussion
confidence: 99%
“…This is the approach of [41], and of many papers maximizing other scores, often described as “variational”. Another prior that typically has large … was introduced in [6] under the name “adaptive slab-and-spike prior”. It pulls every point x in a distribution back to its maximum likelihood point: the result has weight everywhere in the model manifold, but extra weight on the edges.…”
Section: Appendix A1 Square Hypercone
confidence: 99%
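To make the mechanism described above concrete, here is a toy sketch in the spirit of a slab-and-spike construction, assuming a deliberately simple model y = θ + Gaussian noise with θ confined to [0, 1] (for which the maximum likelihood point of an observation x is clip(x, 0, 1)); the actual adaptive prior of [6] is more elaborate than this:

```python
import numpy as np

# Pulling a broad distribution of observations back to their maximum likelihood
# points leaves continuous weight inside [0, 1] (the "slab") plus point masses
# at the two boundaries (the "spikes"). Model and numbers here are hypothetical.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=1.0, size=100_000)  # broad data distribution
theta_hat = np.clip(x, 0.0, 1.0)                  # pull back to the MLE

interior = (theta_hat > 0.0) & (theta_hat < 1.0)
print("slab weight:", interior.mean())
print("spike at 0: ", (theta_hat == 0.0).mean())
print("spike at 1: ", (theta_hat == 1.0).mean())
```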
“…Further, they show that specific models are not sloppy at microscopic scales, and that sloppiness emerges only when fit to collective behaviour. Transtrum et al and Quinn et al suggest that the existence of sloppiness (the dependence of model behaviour on only a few macroscopic parameter directions) is the origin of simplicity in science and is responsible for the emergence of comprehensible macroscopic theories from highly complex microscopic processes [34, 35]. Evangelou et al have developed a method to identify important and unimportant parameter combinations using manifold learning techniques (diffusion maps, Dmaps).…”
Section: Impact Of Sloppiness In Optimization Experiments Design and ...
confidence: 99%
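The Dmaps technique mentioned above can be sketched generically. The following is a minimal textbook diffusion-maps construction on an arbitrary toy dataset, not the implementation of Evangelou et al; the kernel scale eps and the example data are assumptions for illustration:

```python
import numpy as np

def diffusion_map(X, eps, n_coords=2):
    """Embed samples X (n x d) using the leading nontrivial eigenvectors of a
    Gaussian-kernel Markov matrix (standard diffusion maps, alpha = 1)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / eps)                 # Gaussian kernel
    q = K.sum(axis=1)
    K = K / np.outer(q, q)                      # alpha = 1: factor out sampling density
    P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    return vecs.real[:, order[1:n_coords + 1]]  # skip the trivial constant mode

# Toy data: parameter samples that vary mostly along one effective direction.
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 1.0, 200)
X = np.column_stack([t, t ** 2]) + 0.01 * rng.normal(size=(200, 2))
coords = diffusion_map(X, eps=0.05)
print(coords.shape)  # (200, 2); the leading coordinate tracks the slow variable t
```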