2013
DOI: 10.1016/j.csda.2013.04.005

Conjugate and conditional conjugate Bayesian analysis of discrete graphical models of marginal independence


Cited by 7 publications (16 citation statements)
References: 34 publications
“…4 Probability Based MCMC Samplers 4.1 Initial Set-up and Data Augmentation for MCMC Following the notation of Ntzoufras and Tarantola (2013), we can divide the class of graphical log-linear marginal models in two major categories: homogeneous and non-homogeneous models. A bi-directed graph is named homogeneous if it does not include a bi-directed 4-chain or a chordless 4-cycle as subgraphs.…”
Section: Likelihood Specification and Posterior Inference (mentioning)
confidence: 99%
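The homogeneity condition quoted above can be checked directly on small graphs. Below is a minimal brute-force sketch (not from the cited paper): the graph encoding and function name are illustrative, bi-directed edges are treated as unordered pairs, and the 4-chain and chordless 4-cycle are read as induced subgraphs on four vertices.

```python
from itertools import combinations

def is_homogeneous(vertices, edges):
    """Check the homogeneity condition: no four vertices may induce a
    bi-directed 4-chain (path on 4 vertices) or a chordless 4-cycle.
    `edges` lists the bi-directed edges as unordered pairs."""
    adj = {frozenset(e) for e in edges}
    for quad in combinations(vertices, 4):
        n_edges = sum(1 for pair in combinations(quad, 2) if frozenset(pair) in adj)
        degrees = sorted(
            sum(1 for v in quad if v != u and frozenset((u, v)) in adj) for u in quad
        )
        if n_edges == 3 and degrees == [1, 1, 2, 2]:   # induced 4-chain
            return False
        if n_edges == 4 and degrees == [2, 2, 2, 2]:   # chordless 4-cycle
            return False
    return True

# A chordless 4-cycle 1<->2<->3<->4<->1 makes the graph non-homogeneous.
print(is_homogeneous([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1)]))  # False
```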
“…interactions). Moreover, we can exploit the advantages of the conditional conjugacy of the algorithm of Ntzoufras and Tarantola (2013) in order to obtain efficient proposals based on probability parameters.…”
Section: Likelihood Specification and Posterior Inference (mentioning)
confidence: 99%
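As background for the quoted remark, conditional conjugacy (in the standard sense, not a claim about the specific algorithm) means that each parameter block has a full conditional in the same family as its conditional prior:

```latex
p(\theta_j \mid \theta_{-j}) \in \mathcal{F}_j
\quad\Longrightarrow\quad
p(\theta_j \mid \theta_{-j}, y) \in \mathcal{F}_j ,
```

so each block can be updated in closed form, which is what makes such full conditionals attractive as MCMC proposal distributions.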
“…The compatibility between the different transformation families is automatically introduced by the use of common imaginary data, as the power-priors are nothing more than rescaled posterior distributions given the imaginary data y*. Similar strategies have been introduced in common model selection problems, such as the well-known g-prior of Zellner, and in graphical models (Ntzoufras & Tarantola, for example). Some interesting properties of this class of power-priors are described in Ibrahim, Chen & Sinha.…”
Section: Bayesian Formulation (mentioning)
confidence: 99%
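The description of power-priors as rescaled posteriors given imaginary data corresponds to the standard power-prior form; the sketch below uses illustrative notation ($a_0$ for the discounting weight, $\pi_0$ for the initial prior) rather than the cited papers' exact symbols:

```latex
\pi(\theta \mid y^{\ast}, a_0) \;\propto\; L(\theta \mid y^{\ast})^{a_0}\, \pi_0(\theta),
\qquad 0 \le a_0 \le 1 ,
```

where $y^{\ast}$ are the imaginary (historical) data and $a_0$ tempers their likelihood, so the prior is a rescaled posterior given $y^{\ast}$.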
“…The classic Bayes school uses different parametric distributions on different parts of θ according to the natures of learning tasks and empirical experiences. Typical examples are those of conjugate priors (Diaconis and Ylvisaker 1979; Ntzoufras and Tarantola 2013). Extensive studies along this line have been made in the machine learning literature, especially on Dirichlet-multinomial for Gaussian mixture.…”
Section: Bayes Approach and Automatic Model Selection (mentioning)
confidence: 99%
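The Dirichlet-multinomial conjugacy mentioned in the quote reduces to adding observed cell counts to the prior hyperparameters. A minimal sketch with made-up numbers (the counts and hyperparameters below are purely illustrative):

```python
import numpy as np

# Dirichlet-multinomial conjugate update: with a Dirichlet(alpha) prior on the
# cell probabilities of a multinomial, observing cell counts n yields a
# Dirichlet(alpha + n) posterior.
alpha = np.array([1.0, 1.0, 1.0])    # symmetric Dirichlet prior hyperparameters
counts = np.array([12, 30, 8])       # observed multinomial cell counts
posterior = alpha + counts           # posterior Dirichlet parameters

print(posterior)                     # [13. 31.  9.]
print(posterior / posterior.sum())   # posterior mean, roughly [0.245 0.585 0.170]
```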