1998
DOI: 10.2307/3318659
Consistency of Bayes Estimates for Nonparametric Regression: Normal Theory

Cited by 28 publications (17 citation statements). References 37 publications.
“…When unknown parameters are estimated using priors and sampled data, it is important to observe that the convergence of the Bayesian method may fail if the underlying probability mechanism allows an infinite number of possible outcomes (e.g., estimation of an unknown probability on N, the set of all natural numbers) [17]. In fact, in these infinite-dimensional situations, this lack of convergence (commonly referred to as inconsistency) is the rule rather than the exception [18]. As emphasized in [17], as more data comes in, some Bayesian statisticians will become more and more convinced of the wrong answer.…”
Section: Comparisons With Other UQ Methods
confidence: 99%
“…However, there are also analytical reasons to be careful about the application of Bayesian methods [88,76,43]. It is, in fact, now well understood that Bayesian methods may fail to converge or may converge towards the wrong solution if the underlying probability mechanism allows an infinite number of possible outcomes [35] and that, in these non-finite-probability-space situations, this lack of convergence (commonly referred to as Bayesian inconsistency) is the rule rather than the exception [36]. There is now a wide literature of positive [19,30,38,67,69,96,92] and negative results [12,35,48,47,61,71] on the consistency properties of Bayesian inference in parametric and non-parametric settings, and an emerging understanding of the fine topological and geometrical properties that determine (in)consistency.…”
Section: Bayesian Inconsistency and Model Misspecification
confidence: 99%
“…Although it is known from the results of Diaconis and Freedman that the Bayesian method may fail to converge, or may converge towards the wrong solution (i.e., be inconsistent), if the underlying probability mechanism allows an infinite number of possible outcomes [14], and that in these non-finite-probability-space situations this lack of convergence (commonly referred to as Bayesian inconsistency) is the rule rather than the exception [15], it is also known, from the Bernstein-von Mises theorem [7,42] (see also LeCam [28]), that consistency (convergence upon observation of sample data) does hold, under some regularity conditions, if the data-generating distribution of the sample data belongs to the finite-dimensional family of distributions parameterized by the model. Furthermore, although it is also known that this convergence may fail under model misspecification [43,21,32,1,2,26,29,22] (i.e.…
Section: Introduction
confidence: 99%
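The excerpts above contrast Diaconis–Freedman inconsistency in infinite-dimensional settings with the Bernstein–von Mises guarantee for well-specified finite-dimensional models. As an illustrative sketch (not taken from any of the cited papers), the consistent parametric case can be simulated with a conjugate normal–normal model, where the posterior visibly contracts around the data-generating mean as the sample size grows; all parameter values below are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = 2.0                     # data-generating mean (known only to the simulation)
sigma2 = 1.0                      # known observation variance
prior_mu, prior_tau2 = 0.0, 10.0  # N(0, 10) prior on the unknown mean

for n in (10, 100, 10_000):
    x = rng.normal(true_mu, np.sqrt(sigma2), size=n)
    # Conjugate normal-normal update: the posterior is Gaussian with
    # precision = prior precision + n / sigma^2.
    post_var = 1.0 / (1.0 / prior_tau2 + n / sigma2)
    post_mean = post_var * (prior_mu / prior_tau2 + x.sum() / sigma2)
    print(f"n={n:>6}  posterior mean={post_mean:.3f}  posterior var={post_var:.6f}")
```

Because the posterior variance behaves like sigma^2/n for large n, the posterior concentrates on the true mean at the parametric rate, which is the well-specified, finite-dimensional regime where the Bernstein–von Mises theorem applies. The infinite-dimensional counterexamples discussed in the excerpts show that no such guarantee holds in general once the parameter space is no longer finite-dimensional.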