Bayesian Statistics 9 2011
DOI: 10.1093/acprof:oso/9780199694587.003.0018
Bayesian Models for Sparse Regression Analysis of High Dimensional Data*

Abstract: This paper considers the task of building efficient regression models for sparse multivariate analysis of high dimensional data sets; in particular, it focuses on cases where the number q of responses Y = (y_k, 1 ≤ k ≤ q) and the number p of predictors X = (x_j, 1 ≤ j ≤ p) to be analysed jointly are both large with respect to the sample size n, a challenging bi-directional task. The analysis of such data sets arises commonly in genetical genomics, with X linked to the DNA characteristics and Y corresponding to measurements …

Cited by 45 publications (88 citation statements); references 13 publications.
“…In the following we provide some technical details, omitted from the main text, of the local and global moves that we found useful to implement. At each sweep of the algorithm, either or both of the moves can be applied to all q regression equations or to a subgroup of them drawn at random without replacement (see Richardson et al. (2011) for an alternative subgroup selection with adaptive probability).…”
Section: S2.2 γ Update
confidence: 99%
“…Adaptive MCMC algorithms have recently been used in a number of different statistical applications, and often lead to significant speed-ups, even in hundreds of dimensions; see, e.g., Rosenthal (2009), Craiu et al. (2009), Giordani and Kohn (2010) and Richardson et al. (2011). On the other hand, adaptive MCMC algorithms use previous iterations to determine their future transitions, so they violate the Markov property which provides the justification for conventional MCMC.…”
Section: Adaptive MCMC
confidence: 99%
“…In particular, Roberts and Rosenthal (2007) show that adaptive algorithms will still converge to the target density π provided they satisfy two fairly mild conditions: "Diminishing Adaptation" (the algorithm adapts by less and less as time goes on) and "Containment" (the chain never gets too lost, in the sense that it remains bounded in probability). Conditions such as these have been used to formally justify adaptive algorithms in many examples; see, e.g., Roberts and Rosenthal (2009) and Richardson et al. (2011). Adaptive MCMC appears to hold great promise for improving statistical computation in many application areas in the years ahead.…”
Section: Adaptive MCMC
confidence: 99%
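To make the Diminishing Adaptation condition concrete, here is a hedged Python sketch of an adaptive random-walk Metropolis sampler whose proposal scale is tuned toward a target acceptance rate with a Robbins-Monro step that decays to zero; the decay exponent 0.6, the target rate 0.44, and the clipping bounds are illustrative assumptions of mine, not the algorithms of the cited papers.

```python
import numpy as np

def adaptive_rwm(logpost, x0, n_iter=10_000, target=0.44, seed=0):
    # Random-walk Metropolis whose log proposal scale is adapted with
    # a Robbins-Monro step gamma_n = n**-0.6 toward a target
    # acceptance rate. gamma_n -> 0 gives Diminishing Adaptation;
    # clipping log_sigma is a crude aid toward Containment.
    # (Assumed tuning constants; not the cited papers' methods.)
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float)).copy()
    lp = logpost(x)
    log_sigma = 0.0
    chain = np.empty((n_iter, x.size))
    for n in range(1, n_iter + 1):
        prop = x + np.exp(log_sigma) * rng.standard_normal(x.size)
        lp_prop = logpost(prop)
        accepted = np.log(rng.uniform()) < lp_prop - lp
        if accepted:
            x, lp = prop, lp_prop
        # Diminishing adaptation: the update size shrinks to zero.
        log_sigma += n ** -0.6 * (float(accepted) - target)
        log_sigma = float(np.clip(log_sigma, -10.0, 10.0))
        chain[n - 1] = x
    return chain

# Usage: sample a standard normal via the adaptive sampler.
draws = adaptive_rwm(lambda x: -0.5 * float(x @ x), x0=np.zeros(1))
```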
“…Gibbs samplers can be implemented in either a deterministic scan or a random scan, among other variants (Johnson et al., 2013; Liu et al., 1994). Although deterministic-scan MCMC algorithms are currently popular in the statistics literature, random-scan algorithms were among the first used in MCMC settings (Geman and Geman, 1984; Metropolis et al., 1953) and remain useful in applications (Lee et al., 2013; Richardson et al., 2010). Random-scan Gibbs samplers can also be implemented adaptively, while the deterministic-scan version cannot.…”
confidence: 99%
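As an illustration of the deterministic-scan versus random-scan distinction, the following Python sketch runs a Gibbs sampler for a bivariate normal under either scan order; the toy target distribution and the correlation value are assumptions chosen for brevity, not an example taken from the cited papers.

```python
import numpy as np

def gibbs_bivariate_normal(n_iter=5_000, rho=0.9, random_scan=True, seed=0):
    # Gibbs sampler for a bivariate normal with unit variances and
    # correlation rho (an assumed toy target). Deterministic scan
    # updates coordinates 0 then 1 each sweep; random scan updates
    # one uniformly chosen coordinate per iteration.
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    cond_sd = np.sqrt(1.0 - rho ** 2)  # sd of x_i given x_j
    out = np.empty((n_iter, 2))
    for t in range(n_iter):
        coords = [int(rng.integers(2))] if random_scan else [0, 1]
        for i in coords:
            # Full conditional: x_i | x_j ~ N(rho * x_j, 1 - rho^2).
            x[i] = rho * x[1 - i] + cond_sd * rng.standard_normal()
        out[t] = x
    return out

det = gibbs_bivariate_normal(random_scan=False)  # deterministic scan
rnd = gibbs_bivariate_normal(random_scan=True)   # random scan
```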