2022
DOI: 10.1080/00031305.2022.2139293
Bayes Factors and Posterior Estimation: Two Sides of the Very Same Coin

Cited by 5 publications (2 citation statements) · References 29 publications
“…For example, with equal prior model probabilities, a BF10 equal to 0.20 indicates that the null model is five times more likely than the alternative model. Going forward, we suppose that equal prior model probabilities (Pr(Model 0) = Pr(Model 1) = 0.5) are always assumed, as is often (implicitly) done in practice; but see Tendeiro and Kiers (2019) and Campbell and Gustafson (2023) for discussion of this practice. Bayesian methods require one to define appropriate prior distributions for all model parameters (Consonni & Veronese, 2008).…”
Section: A Bayesian Alternative For Establishing Equivalence In a Lin... (mentioning)
confidence: 99%
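The arithmetic behind the quoted interpretation can be made explicit. The following is a minimal sketch, not code from the cited paper: the function name and its arguments are ours, and it simply converts a Bayes factor BF10 into posterior model probabilities under the equal-prior-probability assumption described above.

```python
# Minimal sketch (hypothetical helper, not from the cited paper):
# convert a Bayes factor BF10 into posterior model probabilities
# under equal prior model probabilities Pr(Model 0) = Pr(Model 1) = 0.5.

def posterior_model_probs(bf10, prior_m1=0.5):
    """Return (Pr(Model 0 | data), Pr(Model 1 | data)) given BF10.

    BF10 is the Bayes factor for Model 1 over Model 0; the posterior
    odds of Model 1 against Model 0 equal BF10 times the prior odds.
    """
    prior_m0 = 1.0 - prior_m1
    post_odds_10 = bf10 * (prior_m1 / prior_m0)  # posterior odds of Model 1
    p_m1 = post_odds_10 / (1.0 + post_odds_10)
    return 1.0 - p_m1, p_m1

# BF10 = 0.20 means the null model is 1/0.20 = 5 times more likely than
# the alternative, i.e. Pr(Model 0 | data) = 5/6 ≈ 0.833.
p0, p1 = posterior_model_probs(0.20)
print(round(p0, 3), round(p1, 3))  # 0.833 0.167
```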
“…Once the posterior probabilities (2) have been computed, a summary of evidence provided by the observed data in favor of model $M_0$ is given by the Bayes factor $B_{01}$ (see, e.g., Kass and Raftery [3] and Campbell and Gustafson [4]), which is defined as the ratio of the posterior odds of $M_0$ to its prior odds:
$$B_{01} = \frac{\Pr\{M_0 \mid \mathrm{data}\} / \Pr\{M_1 \mid \mathrm{data}\}}{\Pr\{M_0\} / \Pr\{M_1\}}.$$
When the models $M_0$ and $M_1$ are a priori equally probable, so that $\Pr\{M_0\} = \Pr\{M_1\} = 0.5$, the Bayes factor is equal to the posterior odds $\Pr\{M_0 \mid \mathrm{data}\} / \Pr\{M_1 \mid \mathrm{data}\}$.…”
Section: Model Selection and Bayes Factor (mentioning)
confidence: 99%
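The displayed definition translates directly into a short numerical check. This is a minimal sketch under assumed, purely illustrative probability values (they are not taken from the cited paper); it computes $B_{01}$ as the ratio of posterior odds to prior odds and verifies that, with equal prior model probabilities, it coincides with the posterior odds.

```python
# Minimal sketch of the definition above: B01 is the ratio of the
# posterior odds of M0 to its prior odds. The probability values are
# illustrative assumptions, not results from the cited paper.

def bayes_factor_01(post_m0, prior_m0=0.5):
    """B01 = [Pr(M0|data)/Pr(M1|data)] / [Pr(M0)/Pr(M1)]."""
    post_odds = post_m0 / (1.0 - post_m0)
    prior_odds = prior_m0 / (1.0 - prior_m0)
    return post_odds / prior_odds

post_m0 = 0.8                              # hypothetical Pr(M0 | data)
b01 = bayes_factor_01(post_m0)             # prior odds = 1, so B01 = 4.0
posterior_odds = post_m0 / (1.0 - post_m0)
# With Pr(M0) = Pr(M1) = 0.5, the Bayes factor equals the posterior odds.
assert abs(b01 - posterior_odds) < 1e-12
print(b01)
```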