Objective Bayesian Variable Selection (2006)
DOI: 10.1198/016214505000000646

Cited by 121 publications (143 citation statements). References 1 publication.
“…Nonetheless, posterior probabilities (2) can still be formally defined using the Bayes factor with respect to the full model (8). A similar formulation, where the prior on the full model depends on which hypothesis is being tested, has also been adapted by Casella and Moreno (2006) in the context of intrinsic Bayes factors. Their rationale is that the full model is the scientific "null" and that all models should be judged against it.…”
Section: Full-Based Bayes Factors
Confidence: 99%
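As background for the excerpt above, the following display is a minimal sketch, in our own notation rather than that of the cited paper, of how posterior model probabilities can be written through Bayes factors taken against the full model M_F, which is the formulation the excerpt attributes to Casella and Moreno (2006):

$$
B_{jF}(y) = \frac{m_j(y)}{m_F(y)}, \qquad
m_j(y) = \int f(y \mid \theta_j, M_j)\, \pi_j(\theta_j)\, d\theta_j ,
$$
$$
P(M_j \mid y) = \frac{P(M_j)\, B_{jF}(y)}{\sum_k P(M_k)\, B_{kF}(y)} .
$$

Every candidate model M_j is judged against the full model M_F, so each comparison involves nested models; the prior probabilities P(M_j) are left unspecified here.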
“…Our last example uses the ground level ozone data analyzed in Breiman and Friedman (1985), and more recently by Miller (2001) and Casella and Moreno (2006). The dataset consists of daily measurements of the maximum ozone concentration near Los Angeles and 8 meteorological variables (the description for the variables is in Appendix C).…”
Section: Ozone
Confidence: 99%
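Because the ozone problem has only 8 candidate predictors, all 2^8 = 256 submodels can be scored exhaustively. The sketch below is a hypothetical helper that illustrates such an enumeration, using BIC as a simple stand-in score rather than the intrinsic-prior Bayes factors of the cited paper:

import itertools
import numpy as np

def enumerate_models(X, y):
    """Score every subset of the p predictors; BIC is a stand-in criterion."""
    n, p = X.shape
    scores = []
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            # Design matrix: intercept plus the selected columns.
            Xg = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Xg, y, rcond=None)
            rss = float(np.sum((y - Xg @ beta) ** 2))
            bic = n * np.log(rss / n) + (len(subset) + 1) * np.log(n)
            scores.append((subset, bic))
    # Lower BIC is better; for p = 8 this loop visits 256 models.
    return sorted(scores, key=lambda t: t[1])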
“…We believe that our methodology can be successfully applied in this context, with the help of a suitable search algorithm over the space of all models. Since our approach is based on a pairwise comparison of nested models, some form of encompassing is required if an MCMC strategy is adopted; see, in the context of variable selection, Liang et al. (2008) using mixtures of g-priors, or Casella and Moreno (2006) using intrinsic priors. An alternative option is to use a Feature-Inclusion Stochastic Search, as implemented in Scott and Carvalho (2006) for undirected decomposable graphical models.…”
Section: Discussion
Confidence: 99%
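To make the MCMC idea mentioned in the excerpt concrete, here is a minimal sketch of a stochastic search over inclusion indicators. It uses the closed-form marginal likelihood of a fixed-g Zellner g-prior (the fixed-g case discussed by Liang et al. 2008), not the intrinsic priors of Casella and Moreno or the mixtures of g-priors; the function names and defaults (g = n, uniform prior over models) are illustrative assumptions:

import numpy as np

def log_bf_gprior(X, y, gamma, g=None):
    """Log Bayes factor of model `gamma` against the intercept-only model
    under a Zellner g-prior with fixed g (g = n, unit information, by default).
    Illustrative sketch; not the intrinsic-prior Bayes factors of the cited paper."""
    n = len(y)
    g = n if g is None else g
    p_g = int(np.sum(gamma))
    tss = float(np.sum((y - y.mean()) ** 2))
    if p_g == 0:
        r2 = 0.0
    else:
        Xg = np.column_stack([np.ones(n), X[:, gamma]])
        beta, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        r2 = 1.0 - float(np.sum((y - Xg @ beta) ** 2)) / tss
    return 0.5 * (n - p_g - 1) * np.log(1 + g) - 0.5 * (n - 1) * np.log(1 + g * (1 - r2))

def stochastic_search(X, y, n_iter=5000, seed=0):
    """Metropolis sampler that flips one inclusion indicator per iteration
    (uniform prior over models assumed); returns estimated inclusion frequencies."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, dtype=bool)
    cur = log_bf_gprior(X, y, gamma)
    visits = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] = not prop[j]
        new = log_bf_gprior(X, y, prop)
        # Accept with probability min(1, BF ratio) under a uniform model prior.
        if np.log(rng.uniform()) < new - cur:
            gamma, cur = prop, new
        visits += gamma
    return visits / n_iter

Because every proposal flips a single indicator, each comparison is between a model and a neighbor that nests it, which is the pairwise nested-model setting the excerpt describes.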
“…[19], which is a reversible-jump-style algorithm [20], with a model comparison approach using modified fractional Bayes factors [21,22]. The basic idea is to consider a finite number of possible C, denoted by {C_min, . .…”
Section: Posterior Computation
Confidence: 99%
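Since the excerpt invokes fractional Bayes factors without defining them, the following is the standard fractional Bayes factor of O'Hagan (1995), written here in our own notation as background; the "modified" version used by the citing paper may differ, for instance in how the fraction b is chosen:

$$
B^{b}_{10}(y) = \frac{m_1(y)\,/\,m_1^{b}(y)}{m_0(y)\,/\,m_0^{b}(y)},
\qquad
m_i^{b}(y) = \int f(y \mid \theta_i, M_i)^{\,b}\, \pi_i(\theta_i)\, d\theta_i ,
$$

where b in (0, 1) is the fraction of the likelihood used to turn a possibly improper prior π_i into a proper one, and m_i(y) = m_i^1(y). Each of the finitely many candidate values of C would then be compared through such Bayes factors.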