The principle of maximum entropy and a problem in probability kinematics (2013)
DOI: 10.1007/s11229-013-0335-8

Cited by 5 publications (1 citation statement). References 12 publications.
“…Associated with the Log score via McCarthy's theorem (Theorem 2.2) is the Kullback-Leibler divergence, which is the most promising concept of difference for probability distributions in information theory and the one which gives us Bayesian standard conditioning as well as Jeffrey conditioning (see Lukits, 2013). It is noncommutative and may provide the kind of asymmetry required to reflect epistemic asymmetry.…”
Section: Expectations for Information Theory (citation type: mentioning; confidence: 99%)
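The quoted statement turns on the noncommutativity of the Kullback-Leibler divergence: D_KL(P || Q) and D_KL(Q || P) generally differ. As a minimal illustration (a sketch using made-up distributions, not drawn from Lukits 2013 or the citing paper), the following Python computes both directions for two discrete distributions:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) for discrete distributions.

    Computes sum_i p_i * log(p_i / q_i). Terms with p_i == 0 contribute 0
    by convention; assumes q_i > 0 wherever p_i > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions over a three-element outcome space.
p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # approx. 0.1838
print(kl_divergence(q, p))  # approx. 0.1920 -- the two directions disagree
```

This asymmetry is what the quote appeals to: minimizing D_KL(new || old) subject to a constraint fixing the new probabilities of a partition reproduces Jeffrey conditioning (and standard conditioning when one cell receives probability 1), and because the divergence is direction-sensitive it can encode the asymmetry between prior and posterior epistemic states.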