1982
DOI: 10.1080/01621459.1982.10477893

Updating Subjective Probability

Abstract: Jeffrey's rule for revising a probability $P$ to a new probability $P^*$, based on new probabilities $P^*(E_i)$ on a partition $\{E_i\}_{i=1}^{n}$, is $P^*(A) = \sum_i P(A \mid E_i)\, P^*(E_i)$. Jeffrey's rule is applicable if it is judged that $P^*(A \mid E_i) = P(A \mid E_i)$ for all $A$ and $i$. This article discusses some of the mathematical properties of this rule, connecting it with sufficient partitions and maximum entropy updating of contingency tables. The main results concern simultaneous revision on two partitions.
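
As a concrete illustration of the rule in the abstract (not from the paper; the outcome space, partition, and numbers below are made up), a minimal Python sketch:

```python
# Jeffrey's rule: P*(A) = sum_i P(A | E_i) P*(E_i), with the conditional
# probabilities P(. | E_i) held fixed. Minimal sketch over a finite space.

def jeffrey_update(prior, partition, new_cell_probs):
    """prior: dict outcome -> probability; partition: list of disjoint
    sets of outcomes covering the space; new_cell_probs: the P*(E_i)."""
    posterior = {}
    for cell, p_star in zip(partition, new_cell_probs):
        p_cell = sum(prior[w] for w in cell)      # P(E_i)
        for w in cell:
            # P*(w) = P(w | E_i) * P*(E_i)
            posterior[w] = prior[w] / p_cell * p_star
    return posterior

# Example: a fair die, revised so that "even" now has probability 0.7.
prior = {w: 1 / 6 for w in range(1, 7)}
partition = [{2, 4, 6}, {1, 3, 5}]
posterior = jeffrey_update(prior, partition, [0.7, 0.3])
print(round(sum(posterior[w] for w in {2, 4, 6}), 3))   # 0.7
```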

Cited by 236 publications (133 citation statements)
References 29 publications
“…Moreover, among all probability measures $R$ on $\mathcal{A}$ satisfying $R(E_{1,j_1} \cap E_{2,j_2} \cap \cdots \cap E_{k,j_k}) = \mu(E_{1,j_1} \cap E_{2,j_2} \cap \cdots \cap E_{k,j_k})$, and hence preserving the independence of the partitions $E^{(1)}, \ldots, E^{(k)}$, $Q$ is nearest to $P$ on several measures of closeness, including the variation distance, the Hellinger distance, and the Kullback-Leibler divergence, in the latter two cases uniquely so (see Diaconis and Zabell 1982).…”
Section: Independence Preservation
confidence: 96%
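
The Kullback-Leibler part of this claim is easy to check numerically. A small sketch (illustrative only; the atoms, cells, and numbers are made up): $Q$ keeps the within-cell conditional probabilities of $P$ while matching the new cell probabilities, and any other measure matching those cell probabilities lies farther from $P$ in KL divergence.

```python
import numpy as np

P = np.array([0.10, 0.15, 0.20, 0.05, 0.25, 0.25])   # prior over six atoms
cells = [[0, 1], [2], [3], [4, 5]]   # intersection cells of the partitions
mu = np.array([0.3, 0.2, 0.1, 0.4])  # new cell probabilities

# Q: rescale P within each cell to the new cell probability, preserving
# the conditional probabilities inside every cell.
Q = P.copy()
for cell, m in zip(cells, mu):
    Q[cell] = P[cell] / P[cell].sum() * m

def kl(a, b):
    return float(np.sum(a * np.log(a / b)))

# Redistribute mass inside the first cell (its total is still 0.3):
# the result R still matches mu but is strictly farther from P.
R = Q.copy()
R[[0, 1]] = [0.05, 0.25]
assert kl(R, P) > kl(Q, P)
```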
“…This property, however, is not unique to cross-entropy minimization (Diaconis and Zabell 1982). Justifications that identify cross-entropy minimization as the unique method satisfying certain desirable properties can be brought forward along two distinct lines: the first type of argument consists of formal conditions on the input/output relation defined by a method, and a proof that cross-entropy minimization is the only rule that will satisfy these conditions.…”
Section: Inductive Reasoning by Cross-Entropy Minimization
confidence: 98%
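
To make the quoted property concrete: minimizing the Kullback-Leibler divergence to the prior subject to the new partition probabilities reproduces Jeffrey's rule. A hedged numeric sketch (the names, numbers, and the use of SciPy's general-purpose solver are my own, not from the quoted text):

```python
import numpy as np
from scipy.optimize import minimize

P = np.array([0.1, 0.2, 0.3, 0.4])   # prior over four atoms
cells = [[0, 1], [2, 3]]             # partition {E_1, E_2}
q = np.array([0.7, 0.3])             # new probabilities P*(E_i)

def kl(R):
    # Kullback-Leibler divergence KL(R || P)
    return float(np.sum(R * np.log(R / P)))

# Equality constraints: each cell of the partition gets its new probability.
cons = [{"type": "eq", "fun": lambda R, c=c, qi=qi: R[c].sum() - qi}
        for c, qi in zip(cells, q)]
res = minimize(kl, P, bounds=[(1e-9, 1)] * 4, constraints=cons)

# Jeffrey's rule gives the minimizer in closed form.
jeffrey = np.concatenate([P[c] / P[c].sum() * qi for c, qi in zip(cells, q)])
assert np.allclose(res.x, jeffrey, atol=1e-5)
```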
“…Furthermore, an iterated proportional fitting procedure may be devised to determine the weights [17]; this is a maximum entropy procedure with many variants and extensions [7]. Another advantage is that the computation of the function f remains a simple summation, the same number of weights is used, and the weights can be made to satisfy desirable constraints, such as falling within certain pre-specified intervals.…”
Section: Extensions
confidence: 99%
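
For readers unfamiliar with it, iterated proportional fitting alternately rescales the rows and columns of a contingency table until both sets of margins match their targets; its fixed point minimizes the cross-entropy to the seed table among all tables with the prescribed margins. A minimal sketch with made-up numbers:

```python
import numpy as np

T = np.array([[10.0, 20.0],
              [30.0, 40.0]])          # seed table
row_targets = np.array([40.0, 60.0])  # required row sums
col_targets = np.array([55.0, 45.0])  # required column sums

for _ in range(100):
    T *= (row_targets / T.sum(axis=1))[:, None]   # rescale rows
    T *= col_targets / T.sum(axis=0)              # rescale columns

assert np.allclose(T.sum(axis=1), row_targets)
assert np.allclose(T.sum(axis=0), col_targets)
```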