2013
DOI: 10.1007/s11023-013-9322-6

Towards an Informational Pragmatic Realism

Abstract: I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating fr…

Cited by 16 publications (33 citation statements)
References 35 publications (67 reference statements)
“…Being informationally stingy, that we should only update probability distributions when the information requires it, pushes inductive inference toward objectivity. Thus, using the PMU helps formulate a pragmatic (and objective) procedure for making inferences using (informationally) subjective probability distributions [41].…”
Section: Introduction
confidence: 99%
“…Being informationally stingy, that we should only update probability distributions when the information requires it, pushes inductive inference toward objectivity. Thus using the PMU helps formulate a pragmatic (and objective) procedure for making inferences using (informationally) subjective probability distributions [28].…”
Section: The Design of Entropic Inference
confidence: 99%
“…One is forced to address similar questions in the context of designing the relative entropy as a tool for updating probability distributions in the presence of new information (e.g., "What is information?") [1]. In the context of inference, correlations are broadly defined as being statistical relationships between propositions.…”
Section: Introduction
confidence: 99%
“…When one has incomplete information, the tools one must use for reasoning objectively are probabilities [1,2]. The relationships between different propositions x and y are quantified by a joint probability density, p(x, y) = p(x|y)p(y) = p(x)p(y|x), where the conditional distribution p(y|x) quantifies what one should believe about y given information about x, and vice-versa for p(x|y).…”
Section: Introduction
confidence: 99%
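The product rule quoted in the last citation statement, p(x, y) = p(x|y)p(y) = p(x)p(y|x), can be checked numerically. The sketch below uses a small made-up joint distribution (the numbers are illustrative, not from the paper) and verifies that both factorizations reproduce the joint:

```python
# Illustrative joint distribution over two binary propositions x and y.
# The specific probabilities are assumed for this example only.
joint = {
    ("x0", "y0"): 0.10, ("x0", "y1"): 0.30,
    ("x1", "y0"): 0.20, ("x1", "y1"): 0.40,
}

# Marginals: p(x) = sum_y p(x, y) and p(y) = sum_x p(x, y).
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Conditionals: p(x|y) = p(x, y) / p(y) and p(y|x) = p(x, y) / p(x).
p_x_given_y = {(x, y): p / p_y[y] for (x, y), p in joint.items()}
p_y_given_x = {(x, y): p / p_x[x] for (x, y), p in joint.items()}

# Both factorizations recover the joint distribution.
for (x, y), p in joint.items():
    assert abs(p - p_x_given_y[(x, y)] * p_y[y]) < 1e-12
    assert abs(p - p_x[x] * p_y_given_x[(x, y)]) < 1e-12
```

Either conditional quantifies what one should believe about one proposition given information about the other, which is exactly the symmetry the quoted passage appeals to.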